CVE-2025-46722: vLLM is an inference and serving engine for large language models (LLMs). Versions starting from 0.7.0 and before 0.9. | AI Sec Watch