Home / Technology / One link can steal your Copilot data
One link can steal your Copilot data
25 Jan
Summary
- A single link can hijack Copilot sessions undetected.
- The Reprompt attack bypasses some AI data protections.
- Microsoft fixed this vulnerability in January 2026.

Security researchers have uncovered a vulnerability in Microsoft Copilot, dubbed 'Reprompt,' that allows attackers to discreetly access personal data. The attack begins with a user clicking a malicious Copilot link, which embeds hidden instructions. These instructions can manipulate Copilot into performing actions and divulging information it would normally protect.
The Reprompt technique exploits parameters within Copilot's web address to execute hidden commands upon page load. Researchers combined this with a 'try twice' method to circumvent Copilot's initial data leak protections, finding that a second attempt could bypass stricter checks. Additionally, the attack enables continuous instruction from a remote server.
While Microsoft addressed this issue in its January 2026 Patch Tuesday updates, and no real-world exploitation was reported beforehand, the discovery highlights inherent risks with AI assistants. These tools, possessing access, memory, and the ability to act on a user's behalf, present significant privacy concerns if their safeguards fail.




