CVE-2025-48942: vLLM is an inference and serving engine for large language models (LLMs). In versions 0.8.0 up to but excluding 0.9.0, sending an invalid JSON schema to the /v1/completions API endpoint can crash the server.
Summary
vLLM (an inference and serving engine for large language models) versions 0.8.0 up to but excluding 0.9.0 have a vulnerability where sending an invalid JSON schema as a parameter to the /v1/completions API endpoint crashes the server. This happens because the application does not catch the exceptions raised while processing the malformed input.
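As an illustration of the failure mode, a request body along these lines could trigger the crash on an affected server. The model name and schema contents are made up for illustration; the point is that the `guided_json` value is not a valid JSON Schema, which on vulnerable versions raised an exception server-side that was never caught:

```python
import json

# Hypothetical body for POST /v1/completions on a vLLM server.
# "objectt" is not a valid JSON Schema type, so schema compilation
# fails; on versions >=0.8.0,<0.9.0 that failure was unhandled.
payload = {
    "model": "example-model",          # placeholder model name
    "prompt": "Say hello",
    "max_tokens": 16,
    "guided_json": {"type": "objectt"},  # malformed schema
}
body = json.dumps(payload)
```

A patched server rejects such a request with a client error instead of terminating.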
Solution / Mitigation
Update to vLLM version 0.9.0 or later, which fixes the issue.
Vulnerability Details
CVSS score: 6.5 (Medium)
EPSS: 0.1%
Related Issues
CVE-2022-29200: TensorFlow is an open source platform for machine learning.
CVE-2021-29541: TensorFlow is an end-to-end open source platform for machine learning.
Original source: https://nvd.nist.gov/vuln/detail/CVE-2025-48942
First tracked: February 15, 2026 at 08:44 PM
Classified by LLM (prompt v3) · confidence: 95%