CVE-2026-22778: vLLM is an inference and serving engine for large language models (LLMs). In versions from 0.8.3 up to (but not including) 0.14.1, when an invalid i | AI Sec Watch