CVE-2025-30202: vLLM is a high-throughput and memory-efficient inference and serving engine for LLMs. Versions starting from 0.5.2 and prior to 0.8.5 are affected.
Summary
vLLM versions 0.5.2 through 0.8.4 have a security vulnerability in multi-node deployments where a ZeroMQ socket (a tool for sending messages between different computers) is left open to all network interfaces. An attacker with network access can connect to this socket to see internal vLLM data or deliberately slow down the system by connecting repeatedly without reading the data, causing a denial of service (making the system unavailable or very slow).
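The core issue is a listening socket bound to all network interfaces instead of loopback. As a minimal sketch of that distinction (using plain TCP sockets from the standard library rather than vLLM's actual ZeroMQ transport, whose binding semantics are the same), compare the two bind addresses:

```python
import socket

def bound_interface(host: str) -> str:
    """Bind a listening socket to `host` and report the address it exposes."""
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.bind((host, 0))  # port 0: let the OS pick a free ephemeral port
    addr = s.getsockname()[0]
    s.close()
    return addr

# Vulnerable pattern: bind to all interfaces, so any host with network
# access to this machine can connect.
print(bound_interface("0.0.0.0"))    # 0.0.0.0

# Hardened pattern: bind to loopback only, so the socket is unreachable
# from other machines.
print(bound_interface("127.0.0.1"))  # 127.0.0.1
```

Restricting internal coordination sockets to loopback (or a private interconnect secured at the network layer) is the general defense against this class of exposure.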
Solution / Mitigation
This issue has been patched in version 0.8.5. Update vLLM to version 0.8.5 or later.
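A typical upgrade path for a pip-managed install (command names are standard pip/Python usage, not taken from the advisory) is:

```shell
# Upgrade vLLM to the patched release line and confirm the version.
pip install --upgrade "vllm>=0.8.5"
python -c "import vllm; print(vllm.__version__)"
```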
Vulnerability Details
CVSS: 7.5 (High)
EPSS: 0.4%
Related Issues
CVE-2025-45150: Insecure permissions in LangChain-ChatGLM-Webui commit ef829 allows attackers to arbitrarily view and download sensitive
CVE-2022-29200: TensorFlow is an open source platform for machine learning. Prior to versions 2.9.0, 2.8.1, 2.7.2, and 2.6.4, the implem
Original source: https://nvd.nist.gov/vuln/detail/CVE-2025-30202
First tracked: February 15, 2026 at 08:44 PM