CVE-2026-22807: vLLM is an inference and serving engine for large language models (LLMs). Starting in version 0.10.1 and prior to version … | AI Sec Watch