CVE-2024-21802
📋 TL;DR
A heap-based buffer overflow vulnerability in the GGUF library's info->ne functionality of llama.cpp allows remote code execution when processing malicious .gguf files. This affects systems using vulnerable versions of llama.cpp to parse GGUF model files. Attackers can exploit this by providing specially crafted files to trigger the vulnerability.
💻 Affected Systems
- llama.cpp
📦 What is this software?
llama.cpp is an open-source C/C++ library for running inference on LLaMA-family large language models, created and maintained by Georgi Gerganov. It loads model weights from GGUF files, which is where this vulnerability is triggered.
⚠️ Risk & Real-World Impact
Worst Case
Remote code execution with the privileges of the llama.cpp process, potentially leading to full system compromise.
Likely Case
Remote code execution leading to data theft, system compromise, or lateral movement within the network.
If Mitigated
Denial of service or application crash if exploit fails or is blocked by security controls.
🎯 Exploit Status
Exploitation requires providing a malicious GGUF file, which can be done remotely if the system accepts such files from untrusted sources.
🛠️ Fix & Mitigation
✅ Official Fix
Patch Version: Versions after commit 18c2e17
Vendor Advisory: https://github.com/ggerganov/llama.cpp/security/advisories
Restart Required: Yes
Instructions:
1. Update llama.cpp to the latest version from the official GitHub repository.
2. Recompile the application.
3. Restart any services using llama.cpp.
🔧 Temporary Workarounds
Input Validation
Implement strict validation of GGUF files before processing, rejecting files with suspicious structures.
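As a sketch of what such pre-flight validation could look like, the check below parses the fixed-size GGUF header before handing the file to llama.cpp. It assumes the v2/v3 GGUF header layout (magic bytes, uint32 version, uint64 tensor and key-value counts); `MAX_TENSORS` is an assumed sanity threshold, not an official limit.

```python
import struct

GGUF_MAGIC = b"GGUF"   # first four bytes of every GGUF file
GGUF_MAX_DIMS = 4      # ggml tensors carry at most 4 dimensions
MAX_TENSORS = 100_000  # assumed sanity cap, not an official limit

def check_gguf_header(data: bytes) -> bool:
    """Reject files whose header is malformed, assuming the v2/v3
    GGUF layout: magic, uint32 version, uint64 tensor and KV counts."""
    if len(data) < 24 or data[:4] != GGUF_MAGIC:
        return False
    (version,) = struct.unpack_from("<I", data, 4)
    if version not in (2, 3):
        return False
    n_tensors, _n_kv = struct.unpack_from("<QQ", data, 8)
    return n_tensors <= MAX_TENSORS

def check_n_dims(n_dims: int) -> bool:
    """Cap the per-tensor dimension count, the kind of unchecked
    value that overflows the fixed-size ne[] array in this CVE."""
    return 1 <= n_dims <= GGUF_MAX_DIMS
```

A gateway or upload handler would run `check_gguf_header` on the first 24 bytes of any incoming .gguf file and drop anything that fails, before the file ever reaches the vulnerable parser.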
Sandbox Execution
Run llama.cpp in a sandboxed environment with limited privileges to contain potential exploitation.
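One lightweight way to approximate this, sketched below, is launching llama.cpp with POSIX resource limits applied in the child process. This is defense in depth only, not a real sandbox; pair it with containers, seccomp, or a dedicated low-privilege user. The 4 GiB memory cap is an assumption and should be sized to your models.

```python
import resource
import subprocess

def run_llama_sandboxed(argv, mem_limit=4 << 30):
    """Run a llama.cpp command with POSIX resource limits applied
    in the child, so a runaway allocation from a corrupt GGUF file
    is killed early rather than left to corrupt the heap further."""
    def apply_limits():
        # Cap the address space and disable core dumps in the child.
        resource.setrlimit(resource.RLIMIT_AS, (mem_limit, mem_limit))
        resource.setrlimit(resource.RLIMIT_CORE, (0, 0))
    return subprocess.run(argv, preexec_fn=apply_limits)
```

For example, `run_llama_sandboxed(["./main", "-m", "model.gguf", "-p", "hi"])`, where the binary name and flags are illustrative and depend on your build.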
🧯 If You Can't Patch
- Restrict GGUF file sources to trusted repositories only.
- Implement network segmentation to isolate systems running vulnerable llama.cpp versions.
🔍 How to Verify
Check if Vulnerable:
Check whether the fix commit 18c2e17 is present in your build's source history. Run: git log --oneline | grep 18c2e17. If this prints nothing, the fix is absent and the build should be treated as vulnerable.
Check Version:
git log --oneline -1
Verify Fix Applied:
Confirm the fix commit is an ancestor of your current checkout. Run: git merge-base --is-ancestor 18c2e17 HEAD; an exit status of 0 means the fix is included.
📡 Detection & Monitoring
Log Indicators:
- Unexpected process crashes of llama.cpp
- Abnormal memory usage patterns in llama.cpp processes
Network Indicators:
- Unusual outbound connections from llama.cpp processes
- Suspicious file uploads to endpoints accepting GGUF files
SIEM Query:
Process:llama.cpp AND (EventID:1000 OR MemoryUsage > threshold)