CVE-2024-4897

CVSS 8.4 (HIGH)

📋 TL;DR

This vulnerability allows remote attackers to execute arbitrary code on systems running lollms-webui by uploading malicious model files through the binding_zoo feature. It stems from an unpatched dependency, llama-cpp-python, which unsafely renders chat templates embedded in GGUF model files (tracked separately as CVE-2024-34359). All users running vulnerable versions of lollms-webui are affected.

💻 Affected Systems

Products:
  • parisneo/lollms-webui
Versions: All versions up to and including commit b454f40a
Operating Systems: Linux, Windows, macOS
Default Config Vulnerable: ⚠️ Yes
Notes: Vulnerability requires the binding_zoo feature to be enabled and accessible. The vulnerable dependency is llama-cpp-python version 0.2.61+cpuavx2-cp311-cp311-manylinux_2_31_x86_64.

📦 What is this software?

lollms-webui (Lord of Large Language Models Web UI) is an open-source web interface by ParisNeo for running and chatting with large language models. Its binding_zoo feature lets users install model backends ("bindings") and download model files, including GGUF models that are loaded through the llama-cpp-python dependency.

⚠️ Risk & Real-World Impact

🔴 Worst Case: Complete system compromise allowing attackers to execute arbitrary commands, install malware, exfiltrate data, or pivot to other systems.

🟠 Likely Case: Remote code execution leading to data theft, cryptocurrency mining, or botnet enrollment.

🟢 If Mitigated: Limited impact if proper network segmentation and least-privilege principles are implemented.

🌐 Internet-Facing: HIGH - The binding_zoo feature allows external model downloads from Hugging Face, making internet-facing instances particularly vulnerable.
🏢 Internal Only: MEDIUM - Internal instances are still vulnerable if attackers gain internal network access or through insider threats.

🎯 Exploit Status

Public PoC: ⚠️ Yes
Weaponized: LIKELY
Unauthenticated Exploit: ⚠️ Yes
Complexity: LOW

Exploitation is straightforward via the binding_zoo feature: an attacker supplies a GGUF model file whose embedded chat template executes attacker-controlled code when the model is loaded. The underlying flaw is tracked as CVE-2024-34359 in llama-cpp-python.

🛠️ Fix & Mitigation

✅ Official Fix

Patch Version: Not available

Vendor Advisory: https://huntr.com/bounties/ecf386df-4b6a-40b2-9000-db0974355acc

Restart Required: Yes

Instructions:

1. Monitor the lollms-webui GitHub repository for updates.
2. Once a patch is released, update to the fixed version.
3. Update the llama-cpp-python dependency to a version that addresses CVE-2024-34359, e.g. pip install --upgrade llama-cpp-python

🔧 Temporary Workarounds

Disable binding_zoo feature (platform: all)

Prevent model uploads and downloads through the vulnerable feature:

# Edit the lollms-webui configuration to disable binding_zoo
# Look for binding_zoo settings in the config files and set the feature to disabled

Network isolation (platform: linux)

Restrict network access to lollms-webui instances:

# Allow the local machine, then drop all other traffic to the
# lollms-webui port (replace [lollms-port] with the configured port)
sudo iptables -A INPUT -p tcp --dport [lollms-port] -s 127.0.0.1 -j ACCEPT
sudo iptables -A INPUT -p tcp --dport [lollms-port] -j DROP

🧯 If You Can't Patch

  • Disable the binding_zoo feature completely in configuration
  • Implement strict network controls to limit access to lollms-webui instances

🔍 How to Verify

Check if Vulnerable:

Check whether the lollms-webui checkout is at or before commit b454f40a and whether llama-cpp-python 0.2.61+cpuavx2-cp311-cp311-manylinux_2_31_x86_64 (or any other version predating the CVE-2024-34359 fix) is installed.

Check Version:

Check the lollms-webui version in the application interface or configuration files. For llama-cpp-python: pip show llama-cpp-python
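The dependency check can be scripted. A minimal sketch using only the standard library; the fixed-version threshold of 0.2.72 is our reading of the upstream CVE-2024-34359 advisory and should be verified before relying on it:

```python
import re
from importlib import metadata

# First llama-cpp-python release believed to address CVE-2024-34359
# (assumption -- confirm against the upstream advisory).
FIXED = (0, 2, 72)

def parse_version(v):
    """Strip any local segment (e.g. '+cpuavx2-...') and return a numeric tuple."""
    public = v.split('+', 1)[0]
    return tuple(int(x) for x in re.findall(r'\d+', public)[:3])

def is_vulnerable(version_string):
    """True if the installed version predates the assumed fixed release."""
    return parse_version(version_string) < FIXED

try:
    installed = metadata.version('llama-cpp-python')
    print(installed, 'VULNERABLE' if is_vulnerable(installed) else 'patched')
except metadata.PackageNotFoundError:
    print('llama-cpp-python is not installed')
```

Running this on a host with the 0.2.61 build from the affected-systems section above would report it as VULNERABLE.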

Verify Fix Applied:

Verify that binding_zoo feature is disabled or that llama-cpp-python dependency has been updated to a version patched for CVE-2024-34359.

📡 Detection & Monitoring

Log Indicators:

  • Unusual model file downloads from Hugging Face
  • Suspicious process execution following model uploads
  • Errors in llama-cpp-python processing logs

Network Indicators:

  • Unexpected outbound connections from lollms-webui instances
  • Model downloads from hosts other than huggingface.co, or huggingface.co traffic resolving to unexpected IP addresses

SIEM Query:

source="lollms-webui" AND (event="model_upload" OR event="binding_zoo_download") AND status="success"
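The same screening can be done offline against plain-text logs. A minimal sketch; the field names (event=..., status=...) are assumptions modeled on the SIEM query above, not a documented lollms-webui log format:

```python
import re

# Flags successful binding_zoo downloads and model uploads, mirroring
# the SIEM query. Adjust the pattern to match your actual log schema.
SUSPICIOUS = re.compile(r'event=(model_upload|binding_zoo_download).*status=success')

def flag_suspicious(lines):
    """Return log lines matching the model-upload/binding_zoo-download pattern."""
    return [line for line in lines if SUSPICIOUS.search(line)]

sample = [
    'ts=2024-06-01 event=model_upload file=model.gguf status=success',
    'ts=2024-06-01 event=chat status=success',
    'ts=2024-06-01 event=binding_zoo_download url=hf.co/x status=success',
]
for hit in flag_suspicious(sample):
    print(hit)
```

Matched lines should then be correlated with the process-execution and outbound-connection indicators listed above.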

🔗 References

  • Vendor advisory (huntr): https://huntr.com/bounties/ecf386df-4b6a-40b2-9000-db0974355acc
  • CVE-2024-34359 — related vulnerability in the llama-cpp-python dependency
