CVE-2025-59425 | AI Sec Watch

vLLM is an inference and serving engine for large language models (LLMs). Before version 0.11.0rc2, the API key support