Microsoft Copilot Feature Highlights Data Privacy Concerns

In July 2024, Microsoft released a new feature for Copilot, its generative AI chatbot, allowing it to access and process a company’s proprietary data stored on OneDrive. This gives Microsoft’s corporate clients a powerful tool for summarizing and analyzing internal data, and it changes the calculation those customers make when weighing the advantages of AI in the cloud against data privacy.

Generative AI tools have been able to analyze corporate data before now as well, but the process was too slow to be practical. Even ingesting a single large document could take up to a full day, which meant that more extensive projects, such as learning from a company’s data center or from its data in the cloud, were off-limits to such tools.

However, now that a tool exists that is capable of analyzing such large quantities of enterprise data, companies will need to weigh the benefits of such a platform against the potential risks of exposing that data beyond their own servers.

Copilot OneDrive vs. Data Privacy

When a company shares proprietary data from OneDrive with Copilot, the information is automatically incorporated into Copilot’s machine learning model, potentially exposing sensitive data to the AI. Even though Microsoft’s Copilot offers commercial data protection, it is not clear what would happen if the contract with Microsoft were terminated: would the proprietary data be deleted? If so, how much of it, and would any of it remain accessible outside the company?

“Of course, once Microsoft’s AI ‘knows’ the contents of your company’s internal documents, you’ll be less likely to ever sever your ongoing subscription,” states Mark Hachman, senior editor at PCWorld. Having already entrusted their data to one AI, companies will be reluctant to terminate the subscription only to expose that data once more to a competing model.

Microsoft maintains that the Copilot OneDrive feature uses a dedicated learning model for each company’s data and never shares that data with its external AI apps. In principle, this means the data should be safe from leaks outside the company’s domain. Nonetheless, Microsoft admits that Copilot is not encrypted end-to-end, leaving it potentially vulnerable to attacks, and even internally, data privacy could be compromised if data is exposed to employees without the proper access rights.

Fully Homomorphic Encryption for Private LLMs

The most promising innovation on the horizon for protecting a company’s data privacy while using artificial intelligence is Fully Homomorphic Encryption (FHE). FHE allows applications to perform computations on encrypted data without ever needing to decrypt it. With FHE, a tool like Copilot could run its analysis exclusively on encrypted data, so that proprietary data is never actually exposed to the underlying model.
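To make the idea concrete, below is a minimal sketch of computing on encrypted data, using a toy implementation of the Paillier cryptosystem. Paillier is only partially homomorphic (it supports addition of ciphertexts, while FHE also supports multiplication), and the tiny primes are for illustration only, but the core principle is the same: a server can combine encrypted values into an encrypted result without ever seeing the plaintext.

```python
# Toy Paillier cryptosystem (additively homomorphic), for illustration only.
# Real deployments use large primes and a vetted library; true FHE schemes
# additionally support multiplication on ciphertexts.
import math
import random

def keygen(p=293, q=433):                 # tiny demo primes, NOT secure
    n = p * q
    lam = math.lcm(p - 1, q - 1)          # Carmichael's lambda(n)
    mu = pow(lam, -1, n)                  # modular inverse of lambda mod n
    return (n,), (n, lam, mu)             # public key, private key

def encrypt(pub, m):
    (n,) = pub
    r = random.randrange(1, n)            # randomness makes encryption probabilistic
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    # c = (1 + n)^m * r^n mod n^2, the standard Paillier form with g = n + 1
    return (pow(1 + n, m, n * n) * pow(r, n, n * n)) % (n * n)

def decrypt(priv, c):
    n, lam, mu = priv
    # L(x) = (x - 1) // n recovers m from the (1 + n)^m structure
    return ((pow(c, lam, n * n) - 1) // n) * mu % n

def add_encrypted(pub, c1, c2):
    (n,) = pub
    # Multiplying ciphertexts adds the underlying plaintexts
    return (c1 * c2) % (n * n)

pub, priv = keygen()
a, b = encrypt(pub, 12345), encrypt(pub, 54321)
total = add_encrypted(pub, a, b)          # computed without ever decrypting
print(decrypt(priv, total))               # 66666
```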

Chain Reaction is at the forefront of the race to design and produce a processor that will enable real-time processing of Fully Homomorphic Encryption. The processor will allow AI to process data without compromising it, creating a Private LLM (Large Language Model). Corporations will finally be able to benefit from using AI tools like Copilot without the fear that proprietary code and sensitive corporate information will be compromised.
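As a rough sketch of what private inference could look like, the example below evaluates a single linear layer, the basic building block of neural networks like LLMs, on encrypted inputs. It assumes the python-paillier package (phe), which again is only additively homomorphic; a production Private LLM would need a true FHE scheme, and hardware acceleration, to handle ciphertext multiplications and non-linear layers. The division of roles is the same, though: the server holds the model weights, sees only ciphertexts, and returns a result only the client can decrypt.

```python
# Sketch of private inference on one linear layer (pip install phe).
# Paillier supports exactly the operations used here:
# ciphertext + ciphertext, ciphertext + plaintext, ciphertext * plaintext.
from phe import paillier

# --- client side: generate keys and encrypt the input vector ---
public_key, private_key = paillier.generate_paillier_keypair()
x = [0.5, -1.25, 3.0]                        # sensitive input features
enc_x = [public_key.encrypt(v) for v in x]   # only ciphertexts go to the server

# --- server side: plaintext weights applied to encrypted inputs ---
weights = [0.8, 0.1, -0.4]
bias = 0.25
enc_out = enc_x[0] * weights[0]              # ciphertext * plaintext scalar
for xi, w in zip(enc_x[1:], weights[1:]):
    enc_out = enc_out + xi * w               # ciphertext + ciphertext
enc_out = enc_out + bias                     # ciphertext + plaintext

# --- client side: only the private-key holder sees the result ---
print(private_key.decrypt(enc_out))          # ≈ -0.675
```

The server in this sketch computes weights · x + bias entirely on ciphertexts; the heavy ciphertext arithmetic is exactly the bottleneck that dedicated FHE processors aim to make real-time.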
