CVE-2026-34760: vLLM is an inference and serving engine for large language models (LLMs). Versions from 0.5.5 up to (but not including) 0.18.0 are affected. … | AI Sec Watch