Varonis found a “Reprompt” attack that let a single link hijack Microsoft Copilot Personal sessions and exfiltrate data; Microsoft patched it in January 2026.
Reprompt affected Microsoft Copilot Personal and, according to the Varonis team, gave "threat actors an invisible entry point to perform a data-exfiltration chain that bypasses enterprise security controls ...
Microsoft has fixed a vulnerability in its Copilot AI assistant that allowed hackers to pluck a host of sensitive user data ...