A now-patched vulnerability in Microsoft 365 Copilot could have exposed sensitive data via a novel AI-enabled technique known as “ASCII Smuggling,” which uses special Unicode characters that mirror ASCII text but are not rendered in the user interface.
Researcher Johann Rehberger, who spent many years at Microsoft, explained in an Aug. 26 blog post that ASCII Smuggling would let an attacker make the large language model (LLM) render data that is invisible in the user interface and embed it in clickable hyperlinks, setting the stage for data exfiltration.
Jason Soroko, senior fellow at Sectigo, said the ASCII Smuggling flaw in Microsoft 365 Copilot lets attackers hide malicious code within seemingly harmless text using special Unicode characters. The characters resemble ASCII, Soroko said, but are invisible in the user interface, allowing an attacker to embed hidden data within clickable hyperlinks.
“When a user interacts with these links, the hidden data can be exfiltrated to a third-party server, potentially compromising sensitive information, such as MFA one-time-password codes,” said Soroko.
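The hiding trick described above can be sketched in a few lines. This is an illustration of the general technique, not Rehberger's exploit: it assumes the invisible characters come from the Unicode Tags block (U+E0000–U+E007F), each of which mirrors one ASCII code point but typically renders as nothing in a chat or document UI.

```python
# Sketch of "ASCII Smuggling": each ASCII character is shifted into the
# invisible Unicode Tags block (U+E0000-U+E007F). Many UIs render these
# characters as nothing, yet they survive copy/paste and can ride along
# inside the visible text of a hyperlink.

TAG_BASE = 0xE0000  # first code point of the Unicode Tags block

def smuggle(text: str) -> str:
    """Encode ASCII text as invisible Unicode Tag characters."""
    return "".join(chr(TAG_BASE + ord(c)) for c in text if ord(c) < 128)

def unsmuggle(text: str) -> str:
    """Recover hidden ASCII text by extracting Tag characters."""
    return "".join(
        chr(ord(c) - TAG_BASE)
        for c in text
        if TAG_BASE <= ord(c) <= TAG_BASE + 0x7F
    )

# "otp=123456" is a made-up secret for illustration.
link_text = "Click here" + smuggle("otp=123456")
print(link_text)               # displays only "Click here" in most UIs
print(unsmuggle(link_text))    # an attacker's server recovers "otp=123456"
```

The point of the demo is that `link_text` looks like ordinary anchor text to the victim while carrying the secret in full; an exfiltration URL built from it would leak the data the moment the link is clicked.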
Soroko said the attack works by chaining multiple methods: first, a prompt injection is triggered by sharing a malicious document in a chat; then Copilot is manipulated into searching for more sensitive data; and finally, ASCII Smuggling is used to trick the user into clicking an exfiltration link.
“To mitigate this risk, users should ensure their Microsoft 365 software is updated, as Microsoft has patched the vulnerability,” said Soroko. “Additionally, they should exercise caution when interacting with links in documents and emails, especially those received from unknown or untrusted sources. Regular monitoring of AI tools like Copilot for unusual behavior is also essential to catch and respond to any suspicious activity quickly.”
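The monitoring Soroko recommends could include a simple content check. The sketch below is my own illustration, not Microsoft's fix: it assumes, as in the attack described above, that the hidden payload uses the Unicode Tags block, which has no business appearing in legitimate messages or links.

```python
# Minimal detector sketch: flag any characters from the Unicode Tags
# block (U+E0000-U+E007F) in inbound text. Legitimate chat messages and
# document links should never contain them, so any hit is suspicious.

TAG_BASE = 0xE0000

def find_hidden_tags(text: str) -> list[tuple[int, str]]:
    """Return (position, decoded ASCII character) for every Tag character."""
    return [
        (i, chr(ord(c) - TAG_BASE))
        for i, c in enumerate(text)
        if TAG_BASE <= ord(c) <= TAG_BASE + 0x7F
    ]

clean = "Please review the Q3 report"
tainted = "Click here" + "".join(chr(TAG_BASE + ord(c)) for c in "secret")

print(find_hidden_tags(clean))    # [] -> nothing hidden
print("".join(ch for _, ch in find_hidden_tags(tainted)))  # reveals payload
```

A production filter would cover other invisible or confusable code points as well, but even this narrow check would have surfaced the smuggled text in the scenario described here.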
Rehberger added that while it’s unclear exactly how Microsoft fixed the vulnerability or what mitigations were implemented, the exploits he built and shared with Microsoft in January and February no longer work; it appears that links have not been rendered for a few months.
“I asked MSRC if the team would be willing to share the details around the fix, so others in the industry could learn from their expertise, but did not get a response for that inquiry,” said Rehberger. “Just in case you are wondering, prompt injection, of course, is still possible.”
Evolving nature of AI attacks
This ASCII Smuggling technique highlights the evolving sophistication of AI-enabled attacks, where seemingly innocuous content can conceal malicious payloads capable of exfiltrating sensitive data, said Stephen Kowski, Field CTO at SlashNext Email Security. Kowski said organizations should implement advanced threat detection systems that can analyze content across multiple communication channels, including email, chat, and collaboration platforms.
“These solutions should leverage AI and machine learning to identify subtle anomalies and hidden malicious patterns that traditional security measures might miss,” said Kowski. “Additionally, continuous employee education on emerging threats and the implementation of strict access controls and data loss prevention measures are crucial in mitigating the risks posed by these innovative attack vectors.”
LLMs such as Microsoft 365 Copilot introduce significant risks when exploited by malicious actors, said Matan Getz, co-founder and CEO at Aim Security. Along with ASCII Smuggling, Getz said his team is also concerned about threat actors creating phishing emails that closely mimic legitimate communications.
“Given that Microsoft 365 Copilot is integrated with employees’ email accounts, attackers can craft content that appears genuine while embedding malicious links or attachments,” said Getz. “While LLMs offer immense potential, they must be used with caution. We anticipate that attackers will continue to become more creative and unpredictable, exploiting human vulnerabilities.”
© Copyright 2024 CNB Tel. All rights reserved