GitHub Copilot: Remote Code Execution via Prompt Injection (CVE-2025-53773)
Summary
GitHub Copilot and VS Code are vulnerable to prompt injection (tricking an AI by hiding instructions in its input) that allows an attacker to achieve RCE (remote code execution, where an attacker can run commands on a system they don't own) by modifying a project's settings.json file to put Copilot into 'YOLO mode'. This vulnerability demonstrates a broader security risk: if an AI agent can write to files and modify its own configuration or security settings, it can be exploited for full system compromise.
Classification
Affected Vendors
Related Issues
Original source: https://embracethered.com/blog/posts/2025/github-copilot-remote-code-execution-via-prompt-injection/
First tracked: February 12, 2026 at 02:20 PM
Classified by LLM (prompt v3) · confidence: 85%