CVE-2024-23605

8.8 HIGH

📋 TL;DR

A heap-based buffer overflow in the header.n_kv handling of llama.cpp's GGUF library allows remote code execution when a malicious .gguf file is processed. Any system running a vulnerable version of llama.cpp, or software that embeds the library, is affected. An attacker exploits the flaw by supplying a specially crafted file that triggers the overflow.

💻 Affected Systems

Products:
  • llama.cpp
  • software using llama.cpp GGUF library
Versions: llama.cpp up to commit 18c2e17
Operating Systems: All platforms running llama.cpp
Default Config Vulnerable: ⚠️ Yes
Notes: Any application that processes .gguf files using the vulnerable library is affected.

📦 What is this software?

llama.cpp is a widely used open-source C/C++ inference engine for LLaMA-family large language models. GGUF is its on-disk model file format, and the GGUF library parses these files whenever a model is loaded.

⚠️ Risk & Real-World Impact

🔴 Worst Case

Full remote code execution with the privileges of the llama.cpp process, potentially leading to complete system compromise.

🟠 Likely Case

Remote code execution leading to data theft, system takeover, or deployment of additional malware.

🟢 If Mitigated

Denial of service or an application crash if memory protections prevent code execution.

🌐 Internet-Facing: HIGH - Attackers can deliver malicious files via web interfaces, APIs, or file upload services.
🏢 Internal Only: MEDIUM - Requires internal users to process malicious files, but lateral movement is possible after initial compromise.

🎯 Exploit Status

Public PoC: ⚠️ Yes
Weaponized: LIKELY
Unauthenticated Exploit: ⚠️ Yes
Complexity: LOW

Exploitation requires delivering a malicious .gguf file, which can be done via various attack vectors.

🛠️ Fix & Mitigation

✅ Official Fix

Patch Version: Commit after 18c2e17

Vendor Advisory: https://github.com/ggerganov/llama.cpp/security/advisories

Restart Required: Yes

Instructions:

1. Update llama.cpp to the latest version (any commit after 18c2e17).
2. Rebuild any applications that depend on the library.
3. Restart affected services.
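For a source build, the steps above might look like the following; the checkout path, build system, and service name are assumptions to adapt to your deployment.

```shell
# Sketch of the update procedure, assuming a git checkout of llama.cpp
# built with CMake; the path and service name below are placeholders.
cd /opt/llama.cpp                     # assumed checkout location
git fetch origin && git pull          # pick up commits after 18c2e17
cmake -B build && cmake --build build --config Release
# Rebuild anything that links the library, then restart it, e.g.:
# systemctl restart llama-service     # hypothetical service name
```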

🔧 Temporary Workarounds

Input validation (all platforms):

Validate .gguf files with external tools before processing them.

Process isolation (Linux):

Run llama.cpp in a sandboxed or containerized environment, e.g.:

docker run --read-only --security-opt=no-new-privileges <image>
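As a rough sketch of the input-validation idea: the GGUF header starts with the magic bytes GGUF, then a u32 version, a u64 tensor count, and the u64 n_kv key/value count at byte offset 16, all little-endian. A wrapper can refuse files whose declared n_kv is implausibly large before llama.cpp ever parses them. The cap below is an arbitrary illustration, not a value from the advisory.

```shell
# check_gguf: reject a .gguf file with a bad magic or an absurd declared
# n_kv. GGUF layout assumed (v2/v3): magic "GGUF" @0, u32 version @4,
# u64 n_tensors @8, u64 n_kv @16, little-endian (od reads host byte
# order, so this assumes a little-endian host).
check_gguf() {
  f=$1
  max_kv=65536                               # arbitrary sanity cap (assumption)
  [ "$(head -c 4 "$f")" = "GGUF" ] || { echo "reject: bad magic"; return 1; }
  n_kv=$(od -A n -t u8 -j 16 -N 8 "$f" | tr -d ' ')
  [ "$n_kv" -le "$max_kv" ] || { echo "reject: n_kv=$n_kv too large"; return 1; }
  echo "ok: n_kv=$n_kv"
}

# Demo: build a minimal 24-byte header (version 3, 0 tensors, 5 KV pairs).
printf 'GGUF\003\000\000\000\000\000\000\000\000\000\000\000\005\000\000\000\000\000\000\000' > /tmp/demo.gguf
check_gguf /tmp/demo.gguf     # prints: ok: n_kv=5
```

This only sanity-checks the header; it is a pre-filter, not a full parser, and does not replace patching.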

🧯 If You Can't Patch

  • Disable .gguf file processing functionality
  • Implement strict file upload controls and scanning for malicious files

🔍 How to Verify

Check if Vulnerable:

Check if llama.cpp version is at or before commit 18c2e17

Check Version:

git log --oneline | head -1   # run inside the llama.cpp checkout; vulnerable if at or before 18c2e17

Verify Fix Applied:

Verify llama.cpp is updated to commit after 18c2e17
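A mechanical way to test "at or before 18c2e17" is git's ancestry check. A sketch (not an official vendor check) to run from inside the llama.cpp checkout:

```shell
# is_vulnerable_checkout: report whether HEAD is at or before the last
# known vulnerable commit (18c2e17 per this advisory). A commit counts
# as its own ancestor, so HEAD == 18c2e17 is flagged as vulnerable.
is_vulnerable_checkout() {
  bad=${1:-18c2e17}
  if git merge-base --is-ancestor HEAD "$bad" 2>/dev/null; then
    echo "VULNERABLE: HEAD is at or before $bad"
  else
    echo "likely fixed: HEAD is newer than $bad (or the hash is not in this history)"
  fi
}
# Usage, from inside the llama.cpp checkout:
#   is_vulnerable_checkout
```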

📡 Detection & Monitoring

Log Indicators:

  • Process crashes of llama.cpp
  • Memory access violation errors
  • Unexpected child processes spawned

Network Indicators:

  • Unusual outbound connections from llama.cpp process
  • File downloads to llama.cpp host

SIEM Query:

Process:llama.cpp AND (EventID:1000 OR EventID:1001)

(Example for Windows application-crash events; adapt field names and crash indicators to your SIEM and platform.)
