The Deepfake Nudes Crisis in Schools Is Much Worse Than You Thought
Summary
Teenage boys are using AI "nudify" apps to create deepfake sexual imagery (fake nude photos or videos generated by AI) of their female classmates, which is then shared on social media and messaging apps. Since 2023, this has affected over 600 students at nearly 90 schools across at least 28 countries, and the true scale is likely much higher. Because the explicit imagery depicts minors, it constitutes child sexual abuse material (CSAM), yet schools and law enforcement are often unprepared to respond to these serious incidents.
Original source: https://www.wired.com/story/deepfake-nudify-schools-global-crisis/
First tracked: April 15, 2026 at 08:00 AM
Classified by LLM (prompt v3) · confidence: 85%