CVE-2025-48944: vLLM is an inference and serving engine for large language models (LLMs). In version 0.8.0 up to but excluding 0.9.0, th