Microsoft is planning to offer a privacy-focused version of the ChatGPT chatbot to banks, health care providers, and other large organizations concerned about data leaks and regulatory compliance, according to a report from The Information.
The product, which could be announced “later this quarter,” would run ChatGPT on dedicated servers, separate from those used by other companies or by individual users of the ChatGPT features Microsoft is baking into Edge, Windows, and its other products. This would keep sensitive data from being used to train ChatGPT’s language model and could also prevent inadvertent data leaks—imagine a chatbot revealing information about one company’s product roadmap to another company, just because both companies used ChatGPT.
The catch is that these isolated versions of ChatGPT could cost a lot more to run and use. The report says that the private instances “could cost as much as 10 times what customers currently pay to use the regular version of ChatGPT.”