Machine Learning Attack Series: Image Scaling Attacks
Summary
This post introduces image scaling attacks, a type of adversarial attack (manipulating inputs to fool AI systems) that targets machine learning models through their image preprocessing step: a malicious image is embedded in a larger benign-looking one so that it only becomes visible after the image is downscaled to the model's input size. The author came across the attack while preparing demos and points to academic research on understanding and preventing it.
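The core trick can be illustrated with a minimal NumPy sketch (an assumption-laden toy, not the blog's actual code): a nearest-neighbor downscale samples only a sparse grid of source pixels, so an attacker who overwrites exactly those pixels controls what the model sees, while the image still looks almost entirely benign at full resolution.

```python
import numpy as np

BIG, SMALL = 64, 8          # full-resolution size and model input size
k = BIG // SMALL            # scale factor between the two

# "Benign" cover image a human reviewer would see: all white.
cover = np.full((BIG, BIG), 255, dtype=np.uint8)

# Hidden payload the model should see after downscaling: all black.
hidden = np.zeros((SMALL, SMALL), dtype=np.uint8)

# Plant the payload at the exact positions a nearest-neighbor
# downscale with this stride will sample (every k-th pixel).
attack = cover.copy()
attack[::k, ::k] = hidden

# Only 1/k^2 of the pixels change, so tampering is hard to spot.
changed_fraction = (attack != cover).mean()

# A stride-based nearest-neighbor downscale recovers the payload exactly.
downscaled = attack[::k, ::k]
print(changed_fraction)                          # → 0.015625 (1/64)
print(np.array_equal(downscaled, hidden))        # → True
```

Real preprocessing pipelines use library resizers (e.g. bilinear interpolation in OpenCV or Pillow) rather than plain strided sampling, so practical attacks solve an optimization problem against the specific resize algorithm; the sketch above only shows why sparse sampling makes the attack possible at all.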
Original source: https://embracethered.com/blog/posts/2020/husky-ai-image-rescaling-attacks/
First tracked: February 12, 2026 at 02:20 PM
Classified by LLM (prompt v3) · confidence: 75%