CVE-2026-21869
📋 TL;DR
A vulnerability in the llama.cpp HTTP server lets remote attackers cause memory corruption by sending specially crafted JSON with a negative n_discard value to the completion endpoints. This can lead to a process crash or potentially remote code execution. Anyone running a llama.cpp server built from an affected commit is vulnerable.
💻 Affected Systems
- llama.cpp
📦 What is this software?
llama.cpp is a widely used open-source C/C++ inference engine for running large language models locally. Its built-in HTTP server exposes completion endpoints that accept JSON requests, which is the attack surface here.
⚠️ Risk & Real-World Impact
Worst Case
Remote code execution leading to complete system compromise
Likely Case
Process crash causing denial of service
If Mitigated
No impact if input validation is implemented
🎯 Exploit Status
The vulnerability is deterministic: a single request with a negative n_discard value triggers it, with no special conditions required beyond filling the context window.
🛠️ Fix & Mitigation
✅ Official Fix
Patch Version: None
Vendor Advisory: https://github.com/ggml-org/llama.cpp/security/advisories/GHSA-8947-pfff-2f3c
Restart Required: Yes
Instructions:
No official patch available. Monitor GitHub repository for updates and apply when released.
🔧 Temporary Workarounds
Input validation middleware
Add input validation to reject negative n_discard values before they reach llama.cpp
Disable completion endpoints
Remove or disable the HTTP completion endpoints if they are not required
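The input-validation workaround above can be sketched as a small filter applied in a reverse proxy or middleware layer before requests reach llama.cpp. This is a minimal illustration, not llama.cpp's own API; only the `n_discard` field name comes from the advisory, the function name and surrounding plumbing are hypothetical:

```python
import json

def validate_completion_body(raw_body: bytes):
    """Reject completion request bodies with a negative n_discard.

    Returns (ok, reason). The n_discard field name matches the advisory;
    everything else here is an illustrative sketch, not llama.cpp code.
    """
    try:
        payload = json.loads(raw_body)
    except (ValueError, UnicodeDecodeError):
        return False, "malformed JSON"
    n_discard = payload.get("n_discard", 0)
    # bool is a subclass of int in Python, so screen it out explicitly
    if isinstance(n_discard, bool) or not isinstance(n_discard, int):
        return False, "n_discard must be an integer"
    if n_discard < 0:
        return False, "negative n_discard rejected"
    return True, "ok"
```

A proxy would call this on each POST body and return HTTP 400 on a False result instead of forwarding the request upstream.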
🧯 If You Can't Patch
- Isolate llama.cpp server behind strict network controls
- Implement WAF rules to block requests with negative numeric values in JSON payloads
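The WAF rule above, blocking any negative numeric value in a JSON payload, can be expressed as a short recursive check. This is a sketch of the rule's logic, not a rule for any specific WAF product; note it will also flag legitimate negative values, so it should be scoped to the completion endpoints only:

```python
import json

def has_negative_number(node) -> bool:
    """Recursively check a parsed JSON value for any negative number."""
    if isinstance(node, bool):  # bool is an int subclass; skip it
        return False
    if isinstance(node, (int, float)):
        return node < 0
    if isinstance(node, dict):
        return any(has_negative_number(v) for v in node.values())
    if isinstance(node, list):
        return any(has_negative_number(v) for v in node)
    return False

def block_request(raw_body: bytes) -> bool:
    """Return True if the request body should be blocked."""
    try:
        return has_negative_number(json.loads(raw_body))
    except (ValueError, UnicodeDecodeError):
        return True  # malformed JSON: block by default
```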
🔍 How to Verify
Check if Vulnerable:
Check git commit hash against vulnerable range (up to 55d4206c8)
Check Version:
git log --oneline -1
Verify Fix Applied:
Send a request with a negative n_discard value and confirm the server rejects it (e.g. with HTTP 400) rather than crashing
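A verification probe along these lines can be scripted with the standard library. The `/completion` path and the local URL are assumptions about a default deployment; adjust both to your setup, and only run this against a server you own, since an unpatched server may crash:

```python
import json
import urllib.request
import urllib.error

def build_probe(n_discard: int = -1) -> bytes:
    """Craft a completion request body carrying a negative n_discard."""
    return json.dumps({"prompt": "test", "n_predict": 1,
                       "n_discard": n_discard}).encode()

def probe(url: str = "http://127.0.0.1:8080/completion") -> int:
    """Send the probe; a fixed server should answer 4xx, not crash.

    The URL and endpoint path are assumptions about a local default
    deployment -- adjust to your configuration.
    """
    req = urllib.request.Request(
        url, data=build_probe(),
        headers={"Content-Type": "application/json"})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as exc:
        return exc.code

if __name__ == "__main__":
    print(probe())
```

A connection reset or hang instead of an HTTP error response suggests the server is still in the vulnerable range.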
📡 Detection & Monitoring
Log Indicators:
- Process crashes
- Memory access violation errors
- HTTP requests with negative JSON values
Network Indicators:
- HTTP POST requests to completion endpoints with crafted JSON payloads
SIEM Query:
http.method:POST AND http.uri:*completion* AND (json.n_discard:<0 OR json.*:-[0-9]+)
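If your SIEM does not index JSON fields, the same indicator can be scanned for in raw access or application logs with a simple regex. This assumes request bodies appear in the log lines, which depends entirely on your logging configuration:

```python
import re

# Matches a negative n_discard value inside a logged JSON body.
# Whether bodies are logged at all is deployment-specific.
NEG_N_DISCARD = re.compile(r'"n_discard"\s*:\s*-\d+')

def suspicious_lines(log_lines):
    """Yield log lines containing a negative n_discard value."""
    for line in log_lines:
        if NEG_N_DISCARD.search(line):
            yield line
```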