CVE-2025-47277: vLLM, an inference and serving engine for large language models (LLMs), has an issue in versions 0.6.5 through 0.8.4 that allows remote code execution via unsafe pickle deserialization in its PyNcclPipe service; fixed in version 0.8.5. | AI Sec Watch