Microsoft Copilot

Article written by Hananel Livneh, Head of Product Marketing at Adaptive Shield.

Microsoft Copilot can supercharge productivity. The Microsoft 365 GenAI assistant integrates with Word, PowerPoint, Excel, Teams, and other applications in the Microsoft 365 suite, performing the roles of analyst, copywriter, notetaker, and designer.

Not only that, but it also produces high-quality content in a fraction of the time it would take you to do so. It’s a dream come true for most employees.

However, those benefits come with a significant caveat for enterprises. Can Microsoft Copilot be trusted not to access or share confidential information?

At the core of that question is this one: Is Microsoft Copilot, or any other GenAI assistant, secure?

If you watched the Indy 500 over Memorial Day weekend, you already understand that only a skilled operator should be at the wheel of sensitive, dangerous equipment running at high speed. An inexperienced driver could not have safely passed Pato O’Ward at over 220 MPH, as Josef Newgarden did.

While no one is at risk of physical harm when using GenAI, the principle is the same. Using hyper-productive tools to generate a continual stream of materials from a vast reservoir of corporate information has a dark side: the ease with which data can leak and fall into the wrong hands is remarkable.

Data Access is Only a Query Away

Microsoft Copilot generates materials based on data it can access within the Microsoft product suite. Data that was once hard to locate can now be correlated across hundreds of data points and is only a query away.

If an employee doesn’t realize how sensitive a response is, or trusts Microsoft Copilot without reading it carefully, sensitive customer and competitor information can be shared with outsiders.

To borrow once more from our Indy 500 analogy, users need proper guardrails in place to prevent GenAI-driven data leakage.

Copilot relies on existing Microsoft 365 access controls. If users have broad access to sensitive data, Copilot does as well, and can expose it.
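
To see what that inheritance looks like in practice, consider the minimal sketch below. It is an illustration, not a hardened tool: it assumes an Azure app registration, a valid OAuth access token carrying the Files.Read.All permission, and uses the Microsoft Graph sharedWithMe endpoint. Everything it lists for a user is also reachable by Copilot acting on that user's behalf.

```python
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
TOKEN = "<access-token>"  # assumption: OAuth token with Files.Read.All

def files_shared_with_me(token: str) -> list[dict]:
    """List items other people have shared with the signed-in user.

    Whatever appears here is equally reachable by Copilot when it
    answers that user's prompts.
    """
    resp = requests.get(
        f"{GRAPH}/me/drive/sharedWithMe",
        headers={"Authorization": f"Bearer {token}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json().get("value", [])

for item in files_shared_with_me(TOKEN):
    print(item.get("name"))
```

Running this for a sample of users is a quick way to gauge how wide the blast radius already is before Copilot is switched on.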

Companies should also label sensitive files and folders to prevent Copilot from accessing them.
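
Labels are typically managed through Microsoft Purview, but they can also be applied in bulk programmatically. Below is a hedged sketch that assumes the Microsoft Graph assignSensitivityLabel action on drive items and a label ID copied from your tenant's Purview configuration; verify the exact request shape against the current Graph documentation before relying on it.

```python
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
TOKEN = "<access-token>"               # assumption: token with Files.ReadWrite.All
LABEL_ID = "<sensitivity-label-guid>"  # assumption: label ID taken from Purview

def label_item(drive_id: str, item_id: str) -> None:
    """Apply a sensitivity label to one file, so that policy rather than
    user vigilance decides whether Copilot may surface its contents."""
    resp = requests.post(
        f"{GRAPH}/drives/{drive_id}/items/{item_id}/assignSensitivityLabel",
        headers={"Authorization": f"Bearer {TOKEN}"},
        json={
            "sensitivityLabelId": LABEL_ID,
            "assignmentMethod": "standard",
            "justificationText": "Bulk-labeling confidential files",
        },
        timeout=30,
    )
    resp.raise_for_status()  # Graph answers 202; labeling completes asynchronously
```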

Any Data is Fair Game for GenAI

There is no question that Microsoft Copilot can improve employee performance. However, organizations that refrain from implementing a security structure around its usage do so at their peril. Copilot access should only be granted to employees who require it for their jobs.

These employees must be properly trained in the risks of sending out materials to external users without reading them carefully.

It bears repeating that Copilot’s access mirrors user access. Employees are often granted wide swaths of permissions.

No one anticipates that an employee will dig into long-lost files stored on a drive and use that information. However, GenAI sees anything accessible as fair game.

To truly prevent data leaks through Copilot, admins must be far more precise when defining user access and roles for files stored in corporate drives. Otherwise, the organization will always be at risk of leaking sensitive information.

To summarize, for companies to feel secure using Microsoft Copilot, they need to follow these guidelines:

  • Control access: Implement precise access controls, granting users minimal permissions based on their roles
  • Review resources: Audit all resources, such as documents and spreadsheets, to make sure sensitive resources are not shared broadly or externally (see the sketch after this list)
  • Protect data: Label sensitive files and folders to limit Copilot's access to that data
  • Detect indicators of compromise (IoCs): Deploy robust threat detection tools to safeguard the environment and detect if anything goes wrong
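
To make the "review resources" step concrete, here is a minimal sketch in the same vein as the earlier one. It walks a document library's root folder and flags items whose sharing links reach beyond named individuals. It again assumes a Graph access token with Files.Read.All; the drive ID comes from your own tenant, and pagination is omitted for brevity.

```python
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
TOKEN = "<access-token>"  # assumption: token with Files.Read.All

def broadly_shared(drive_id: str) -> list[tuple[str, str]]:
    """Flag root-level items whose sharing links reach beyond named people.

    A link scope of 'anonymous' or 'organization' means far more eyes,
    and Copilot sessions, can reach the file than the owner may realize.
    """
    headers = {"Authorization": f"Bearer {TOKEN}"}
    flagged = []
    items = requests.get(
        f"{GRAPH}/drives/{drive_id}/root/children",
        headers=headers, timeout=30,
    ).json().get("value", [])
    for item in items:
        perms = requests.get(
            f"{GRAPH}/drives/{drive_id}/items/{item['id']}/permissions",
            headers=headers, timeout=30,
        ).json().get("value", [])
        for perm in perms:
            scope = perm.get("link", {}).get("scope")
            if scope in ("anonymous", "organization"):
                flagged.append((item["name"], scope))
    return flagged

for name, scope in broadly_shared("<drive-id>"):
    print(f"REVIEW: {name!r} is shared at scope '{scope}'")
```

Anything the script flags is a candidate for tightening before Copilot can fold it into a generated answer.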

I hope this article helps you better understand the importance of securing Microsoft Copilot and allows you to use it safely.

Learn more about securing GenAI in SaaS

Sponsored and written by Adaptive Shield.
