Description
vLLM is an inference and serving engine for large language models (LLMs). In versions 0.8.0 up to but excluding 0.9.0, hitting the /v1/completions API with an invalid json_schema as a guided parameter kills the vLLM server. This vulnerability is similar to GHSA-9hcf-v7m4-6m2j/CVE-2025-48943, which covers regex instead of a JSON schema. Version 0.9.0 fixes the issue.
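As a minimal sketch of the request shape involved, the payload below passes an invalid JSON schema as a guided-decoding parameter. This assumes vLLM's OpenAI-compatible extension field `guided_json` and a hypothetical model name; it only constructs the request body, it does not send it. Against an affected 0.8.x server, a request of this shape reportedly crashed the process.

```python
import json

# Hypothetical /v1/completions payload for a vLLM 0.8.x server
# (model name is a placeholder; "guided_json" is vLLM's extension
# field for schema-guided decoding).
payload = {
    "model": "example-model",   # placeholder, not a real model
    "prompt": "Hello",
    "max_tokens": 16,
    # Invalid JSON schema: "type" must be a schema type string
    # (e.g. "object"), not an integer.
    "guided_json": {"type": 42},
}

body = json.dumps(payload)
print(body)
```

Version 0.9.0 rejects such malformed schemas at request validation instead of letting them take down the server.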
CVSS3: 6.5 Medium
Defects
Related vulnerabilities
vLLM DOS: Remotely kill vllm over http with invalid JSON schema
A vulnerability in the v1/completions interface of the vLLM library for working with large language models (LLMs), allowing an attacker to cause a denial of service