Microsoft has addressed a significant vulnerability in its Microsoft 365 Copilot, which allowed for data exfiltration through a method known as ASCII smuggling. The exploit, reported earlier this year, could have enabled attackers to steal personally identifiable information (PII) via phishing emails and prompt injection techniques. This flaw highlighted the potential risks associated with advanced AI applications. The company has now patched the vulnerability, which could have led to substantial data breaches if left unaddressed. Experts have noted that this incident underscores the hidden dangers in the deployment of AI technologies, particularly in handling sensitive user data.
From Copilot to Copirate: How data thieves could hijack Microsoft's chatbot https://t.co/gv8exdD0sQ
A patched vulnerability in @MSFTCopilot could expose sensitive data by running a novel AI-enabled technique known as "ASCII Smuggling" that uses special Unicode characters that mirror ASCII text, but are actually not visible to the user interface. #infosec https://t.co/6WUiwQLZHM
Microsoft Fixes ASCII Smuggling Flaw That Enabled Data Theft from Microsoft 365 Copilot https://t.co/u42vh0mc6v #cybersecurity