{"data":{"id":"4b87576b-a35d-4a16-b1e9-06609f9bbc73","title":"The Deepfake Nudes Crisis in Schools Is Much Worse Than You Thought","summary":"Teenage boys are using AI \"nudify\" apps to create deepfake sexual imagery (fake nude photos or videos generated by AI) of their female classmates, which is then shared on social media and messaging apps. Since 2023, such incidents have affected more than 600 students at nearly 90 schools in at least 28 countries, and the true scale is likely far higher. Explicit imagery involving minors constitutes child sexual abuse material (CSAM), yet schools and law enforcement are often unprepared to respond to these serious incidents.","solution":"N/A -- no mitigation discussed in source.","labels":["safety","policy"],"sourceUrl":"https://www.wired.com/story/deepfake-nudify-schools-global-crisis/","publishedAt":"2026-04-15T10:00:00.000Z","cveId":null,"cweIds":null,"cvssScore":null,"cvssSeverity":null,"severity":"info","attackType":["jailbreak"],"issueType":"news","affectedPackages":null,"affectedVendors":[],"affectedVendorsRaw":["generative AI","nudify apps"],"classifierModel":"claude-haiku-4-5-20251001","classifierPromptVersion":"v3","cvssVector":null,"attackVector":null,"attackComplexity":null,"privilegesRequired":null,"userInteraction":null,"exploitMaturity":null,"epssScore":null,"patchAvailable":null,"disclosureDate":"2026-04-15T10:00:00.000Z","capecIds":null,"crossRefCount":0,"attackSophistication":"trivial","impactType":["safety"],"aiComponentTargeted":null,"llmSpecific":false,"classifierConfidence":0.85,"researchCategory":null,"atlasIds":null}}