Microsoft Copilot has been called one of the most powerful productivity tools on the planet.
Copilot is an artificial intelligence assistant built into every one of your Microsoft 365 applications (Word, Excel, PowerPoint, Teams, Outlook, and more). Microsoft's dream is to eliminate the drudgery of daily work and free humans to focus on being creative problem solvers.
Copilot differs from ChatGPT and other AI tools in that it has access to everything you've ever worked on in 365. Copilot instantly searches and compiles data from documents, presentations, emails, calendars, notes, and contacts.
This is a problem for information security teams. Copilot can access all the sensitive data a user can access, which is often far too much. On average, 10% of a company's M365 data is open to all employees.
Copilot can also rapidly generate net new sensitive data. Before the AI revolution, humans' ability to create and share data far outpaced our ability to protect it. Just look at data breach trends. Generative AI pours kerosene on that fire.
When it comes to generative AI as a whole, there's a lot to unpack: model poisoning, hallucinations, deepfakes, and more. In this article, however, I will focus specifically on data security and how your team can ensure a safe Copilot rollout.
Microsoft 365 Copilot use cases
The use cases for generative AI with collaboration suites like M365 are limitless. It's easy to see why so many IT and security teams are eager to gain early access and prepare their rollout plans. The productivity gains will be huge.
For example, you can open a blank Word document and ask Copilot to draft a proposal for a client based on a target data set, which may include OneNote pages, PowerPoint presentations, and other Office documents. In just a few seconds, you'll get a complete proposal.
Here are more examples Microsoft gave during its launch event:
- Copilot can join your Teams meeting and summarize in real time what's being discussed, capture action items, and tell you what issues were left open in the meeting.
- Copilot in Outlook can help you sort your inbox, prioritize email, summarize topics, and generate responses for you.
- Copilot in Excel can analyze raw data and provide you with insights, trends, and recommendations.
How Microsoft 365 Copilot works
Here's a quick overview of how a Copilot prompt is processed (a toy sketch in code follows these steps):
- Users enter prompts in applications such as Word, Outlook, or PowerPoint.
- Microsoft gathers the user's business context based on their M365 permissions.
- The prompt is sent to an LLM (such as GPT-4) to generate a response.
- Microsoft performs responsible AI checks during post-processing.
- Microsoft generates a response and sends commands back to the M365 application.
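To make that flow concrete, below is a minimal, hypothetical Python sketch of the pipeline. None of these function or class names are real Microsoft APIs; they simply model the five steps, in particular the fact that grounding is scoped by the user's existing permissions.

```python
# A minimal, hypothetical sketch of the Copilot prompt flow described above.
# None of these names are real Microsoft APIs; they only model the pipeline.

from dataclasses import dataclass, field

@dataclass
class Document:
    title: str
    body: str
    allowed_users: set = field(default_factory=set)

@dataclass
class User:
    name: str

    def can_view(self, doc: Document) -> bool:
        return self.name in doc.allowed_users

# Stand-in for the semantic index over a tenant's content.
INDEX = [
    Document("Q3 proposal", "draft proposal...", {"alice"}),
    Document("HR salaries", "confidential...", {"hr_admin"}),
]

def gather_context(user: User, prompt: str) -> list[Document]:
    # Step 2: grounding is scoped to what the user's M365 permissions allow.
    return [d for d in INDEX if user.can_view(d)]

def call_llm(prompt: str, context: list[Document]) -> str:
    # Step 3: placeholder for the real LLM call (e.g., GPT-4).
    titles = ", ".join(d.title for d in context) or "no documents"
    return f"Answer to '{prompt}' grounded in: {titles}"

def passes_responsible_ai_checks(text: str) -> bool:
    # Step 4: placeholder for Microsoft's post-processing checks.
    return "confidential" not in text.lower()

def handle_copilot_prompt(user: User, prompt: str) -> str:
    # Step 1: the user enters a prompt in Word, Outlook, PowerPoint, etc.
    context = gather_context(user, prompt)
    draft = call_llm(prompt, context)
    if not passes_responsible_ai_checks(draft):
        return "Response withheld by responsible AI checks."
    # Step 5: the response is returned to the M365 application.
    return draft

print(handle_copilot_prompt(User("alice"), "Summarize the Q3 proposal"))
```

The takeaway from the sketch: Copilot's effective security boundary is whatever `can_view` returns, i.e., the user's existing M365 permissions.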
Image source: Microsoft
Microsoft 365 Copilot security model
For Microsoft, there's always been an extreme tension between productivity and security.
This was evident during the COVID-19 pandemic, when IT teams rapidly deployed Microsoft Teams without first fully understanding how the underlying security model worked or the state of their organization's M365 permissions, groups, and link policies.
The good news:
- Tenant isolation. Copilot only uses data from the current user's M365 tenant. The AI tool will not surface data from other tenants in which the user may be a guest, nor from any tenants that may have cross-tenant sync set up.
- Training boundaries. Copilot does not use any of your business data to train the foundational LLMs that Copilot uses for all tenants. You should not have to worry about your proprietary data showing up in responses to other users in other tenants.
The bad news:
- Permissions. Copilot surfaces all organizational data for which individual users have at least view permissions.
- Labels. Copilot-generated content will not inherit the MPIP labels of the files from which Copilot sourced its response.
- Humans. Copilot's answers are not guaranteed to be 100% factual or safe; humans must take responsibility for reviewing AI-generated content.
Let’s take the bad news one by one.
Permissions
Granting Copilot access to only what a user can already access would be fine if companies could easily enforce least privilege in Microsoft 365.
Microsoft states in its Copilot data security documentation:
"It's important that you use the permissions models available in Microsoft 365 services, such as SharePoint, to help ensure that the right users or groups have the right access to the right content within your organization."
Source: Data, Privacy, and Security for Microsoft 365 Copilot
However, we know from experience that most organizations fall far short of minimum privileges. Just take a look at some statistics from Microsoft's own Cloud Permissions Risk State Report.
This picture matches what Varonis sees when performing thousands of data risk assessments each year for companies using Microsoft 365. In our report, The Great SaaS Data Exposure, we found that the average M365 tenant has:
- 40+ million unique permissions
- 113K+ sensitive records shared publicly
- 27K+ shared links
Why does this happen? Microsoft 365 permissions are extremely complex. Just think about all the ways a user can access data (a rough audit sketch follows this list):
- Direct user rights
- Microsoft 365 group permissions
- SharePoint local permissions (with custom levels)
- Guest access
- External access
- Public access
- Link access (anyone, organization-wide, direct, guest)
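For a sense of how you might begin auditing this sprawl yourself, here is a rough Python sketch that lists overly broad sharing links on a single drive using the Microsoft Graph API. It assumes you already have an OAuth access token with Files.Read.All; the token, the drive ID, and the single-page listing are simplifications for illustration, not a production audit.

```python
# A rough sketch of auditing sharing links on a OneDrive/SharePoint drive
# via Microsoft Graph. Assumes an existing OAuth access token with
# Files.Read.All (token acquisition, e.g., via MSAL, is out of scope).

import requests

GRAPH = "https://graph.microsoft.com/v1.0"
TOKEN = "<access-token>"     # assumption: acquired separately
DRIVE_ID = "<drive-id>"      # assumption: the document library to audit
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

def list_items(drive_id: str):
    """Yield items in the drive root (one page only, for brevity)."""
    resp = requests.get(f"{GRAPH}/drives/{drive_id}/root/children",
                        headers=HEADERS)
    resp.raise_for_status()
    yield from resp.json().get("value", [])

def risky_links(drive_id: str, item_id: str):
    """Yield sharing links scoped beyond specific, named people."""
    resp = requests.get(
        f"{GRAPH}/drives/{drive_id}/items/{item_id}/permissions",
        headers=HEADERS)
    resp.raise_for_status()
    for perm in resp.json().get("value", []):
        link = perm.get("link") or {}
        # 'anonymous' = anyone with the link; 'organization' = whole tenant
        if link.get("scope") in ("anonymous", "organization"):
            yield link.get("scope"), perm.get("roles", [])

for item in list_items(DRIVE_ID):
    for scope, roles in risky_links(DRIVE_ID, item["id"]):
        print(f"{item['name']}: {scope} link with roles {roles}")
```

A real audit would page through every drive and site and also resolve direct, group, guest, and external grants, which is exactly where the complexity above comes from.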
To make matters worse, permissions are primarily in the hands of the end user, not the IT or security team.
Labels
Microsoft relies heavily on sensitivity labels to enforce DLP policies, apply encryption, and broadly prevent data leaks. In practice, however, getting labels to work is difficult, especially if you rely on humans to apply them.
Microsoft paints a rosy picture of labeling and blocking as the ultimate safety net for your data. Reality reveals a bleaker scenario. As humans create data, labeling frequently lags behind or goes stale.
Blocking or encrypting data can add friction to workflows, and labeling technology is limited to specific file types. The more labels an organization has, the more likely it is that users will become confused. This is especially true for large organizations.
The efficacy of label-based data protection will surely erode as AI generates orders of magnitude more data requiring accurate, automatically updated labels.
Are my labels okay?
Varonis can verify and improve your organization's Microsoft sensitivity labeling by scanning for, discovering, and remediating the following (a toy version of such a check is sketched after the list):
- Sensitive documents without labels
- Sensitive files with incorrect labels
- Non-sensitive files with sensitivity labels
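To illustrate the idea (this is not Varonis's actual engine), here is a toy Python sketch that classifies file contents with simple regexes and compares the result against whatever label each file already carries; the `labels` mapping stands in for real MPIP label metadata.

```python
# A toy sketch of label verification: classify content with simple regexes,
# then flag mismatches between detected sensitivity and the applied label.
# The `labels` dict is an assumed stand-in for real MPIP label metadata.

import re

SENSITIVE_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){16}\b"),
}

def looks_sensitive(text: str) -> bool:
    return any(p.search(text) for p in SENSITIVE_PATTERNS.values())

def audit(files: dict[str, str], labels: dict[str, str]) -> None:
    for name, text in files.items():
        sensitive = looks_sensitive(text)
        label = labels.get(name, "none")
        if sensitive and label == "none":
            print(f"{name}: sensitive content but NO label")
        elif sensitive and label != "confidential":
            print(f"{name}: sensitive content but weak label '{label}'")
        elif not sensitive and label == "confidential":
            print(f"{name}: labeled confidential but looks non-sensitive")

files = {
    "payroll.txt": "Employee SSN 123-45-6789",
    "notes.txt": "Lunch plans for Friday",
}
labels = {"notes.txt": "confidential"}
audit(files, labels)
```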
Humans
AI can make humans lazy. Content generated by LLMs like GPT-4 isn't just good, it's great. In many cases, both the speed and the quality far exceed what a human can produce. As a result, people start blindly trusting AI to create safe and accurate responses.
We have already seen real-world scenarios in which Copilot drafted a proposal for a client that contained sensitive data belonging to a completely different client. The user hits "send" after a quick glance (or no glance at all), and now you have a privacy or data breach on your hands.
Prepare your tenants for Copilot security
Understanding your data security posture before your Copilot rollout is critical. Now that Copilot is generally available, it's a good time to get your security controls in place.
Varonis protects thousands of Microsoft 365 customers with our data security platform, which provides a real-time view of risk and the ability to automatically enforce least privilege.
We can help you solve your biggest security risks with Copilot, with virtually no manual effort. With Varonis for Microsoft 365 you can:
- Automatically discover and classify all sensitive AI-generated content.
- Automatically ensure MPIP labels are correctly applied.
- Automatically enforce least privileges.
- Continuously monitor sensitive data in M365, alerting and responding to abnormal behavior (an illustrative sketch follows this list).
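As a rough illustration of that last point (again, not Varonis's actual detection logic), the sketch below flags users whose access to sensitive files suddenly departs from their historical baseline; the events and the 3-sigma threshold are made up for demonstration.

```python
# An illustrative sketch of flagging abnormal access to sensitive data:
# compare each user's access count in the current window against a simple
# historical baseline. All data and thresholds here are arbitrary.

from collections import Counter
from statistics import mean, pstdev

# Hypothetical audit history: sensitive-file accesses per day, past week.
history = {
    "alice": [3, 2, 4, 3, 2],
    "bob":   [1, 0, 1, 2, 1],
}
today = Counter({"alice": 4, "bob": 35})  # bob suddenly touches 35 files

for user, baseline in history.items():
    mu, sigma = mean(baseline), pstdev(baseline) or 1.0
    count = today[user]
    if count > mu + 3 * sigma:  # crude 3-sigma anomaly rule
        print(f"ALERT: {user} accessed {count} sensitive files "
              f"(baseline {mu:.1f} +/- {sigma:.1f})")
```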
Original article by Chief Security Officer; if reproduced, please credit https://cncso.com/en/preventing-risk-of-data-leakage-from-microsoft-copilot-html