CVE-2024-8982

6.2 MEDIUM

📋 TL;DR

A Local File Inclusion (LFI) vulnerability in OpenLLM 0.6.10 allows attackers to read sensitive files on the server through the web application. Attackers can access configuration files, passwords, and other critical data, potentially leading to full system compromise. All deployments running OpenLLM version 0.6.10 are affected.

💻 Affected Systems

Products:
  • OpenLLM
Versions: 0.6.10
Operating Systems: All
Default Config Vulnerable: ⚠️ Yes
Notes: All deployments of OpenLLM 0.6.10 are vulnerable regardless of configuration.

⚠️ Manual Verification Required

Some vulnerability databases do not carry machine-readable version ranges for this CVE, so automated scanners may be unable to determine whether your system is affected.

Why? Without a vendor/NVD-supplied version range, version-based matching cannot compare your installed version against the affected list, and manual verification is needed.


Recommended Actions:
  1. Review the CVE details at NVD
  2. Check vendor security advisories for your specific version
  3. Test if the vulnerability is exploitable in your environment
  4. Consider updating to the latest version as a precaution

⚠️ Risk & Real-World Impact

🔴 Worst Case: Complete system compromise through credential theft, privilege escalation, and lateral movement within the network.

🟠 Likely Case: Sensitive data exposure, including configuration files, passwords, and potentially private keys.

🟢 If Mitigated: Limited impact with proper file permissions and web server hardening, though the vulnerability still exists.

🌐 Internet-Facing: HIGH
🏢 Internal Only: MEDIUM

🎯 Exploit Status

Public PoC: ⚠️ Yes
Weaponized: LIKELY
Unauthenticated Exploit: ⚠️ Yes
Complexity: LOW

LFI vulnerabilities are typically easy to exploit with publicly available tools and techniques.
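To illustrate why exploitation is low-complexity, here is a minimal sketch of the general LFI pattern (a hypothetical file-serving handler, not OpenLLM's actual code): when user input is joined onto a base path without validation, "../" sequences walk out of the intended directory.

```python
# Hypothetical vulnerable handler: user input is concatenated into a path.
import posixpath  # POSIX path rules, so the example is deterministic

BASE_DIR = "/srv/app/static"  # hypothetical serving root

def vulnerable_read(user_path):
    # BAD: "../" sequences in user_path escape BASE_DIR
    full = posixpath.join(BASE_DIR, user_path)
    with open(full, "rb") as f:
        return f.read()

# An attacker-supplied traversal payload:
payload = "../../../../etc/passwd"
resolved = posixpath.normpath(posixpath.join(BASE_DIR, payload))
print(resolved)  # → /etc/passwd
```

No special tooling is required: the payload is an ordinary HTTP request parameter, which is why public PoCs for LFI bugs appear quickly.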

🛠️ Fix & Mitigation

✅ Official Fix

Patch Version: 0.6.11 or later

Vendor Advisory: https://huntr.com/bounties/b7bdc9a1-51ac-402a-8e6e-0d977699aca6

Restart Required: Yes

Instructions:

1. Update OpenLLM to version 0.6.11 or later using pip: pip install --upgrade "openllm>=0.6.11"
2. Restart the OpenLLM service
3. Verify the update was successful

🔧 Temporary Workarounds

Web Server File Access Restrictions (platform: all)

Configure the web server to serve files only from specific directories. The rules below block direct requests for hidden files (e.g. .env, .git); they reduce exposure but do not fully prevent traversal through the application itself:

# nginx: location ~ /\. { deny all; }
# Apache: <DirectoryMatch "/\."> Require all denied </DirectoryMatch>

File Permission Hardening (platform: Linux)

Restrict read permissions on sensitive files and directories. Note: do not chmod /etc/passwd to 600, as it must remain world-readable (644) for user lookups to work; tighten /etc/shadow and application files instead:

chmod 600 /etc/shadow            # already the default on most distributions
chmod 750 /var/www/              # owner plus web server group only
chown root:root sensitive_config_files

🧯 If You Can't Patch

  • Implement strict input validation and sanitization for all file path parameters
  • Deploy a web application firewall (WAF) with LFI protection rules
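The input-validation point above can be sketched as follows: normalize the requested path under an allowed root, then reject anything that escapes it. The directory name is an assumption; a real deployment should also URL-decode the input first and resolve symlinks (e.g. with os.path.realpath).

```python
# Minimal path-validation sketch: reject traversal out of an allowed root.
import posixpath

ALLOWED_DIR = "/srv/app/static"  # hypothetical serving root

def safe_resolve(user_path: str) -> str:
    """Resolve user_path under ALLOWED_DIR, rejecting traversal attempts."""
    candidate = posixpath.normpath(posixpath.join(ALLOWED_DIR, user_path))
    # After normalization, the path must still live inside ALLOWED_DIR
    if posixpath.commonpath([candidate, ALLOWED_DIR]) != ALLOWED_DIR:
        raise ValueError("path traversal attempt rejected")
    return candidate
```

With this check, "css/site.css" resolves normally, while "../../etc/passwd" or an absolute "/etc/passwd" raises before any file is opened.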

🔍 How to Verify

Check if Vulnerable:

python -c "import openllm; print(openllm.__version__)"

If the output is 0.6.10, the installation is vulnerable.

Verify Fix Applied:

Run the same command and confirm the reported version is 0.6.11 or later.
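For scripted audits, the version comparison can be sketched in Python; this assumes plain dotted numeric version strings as OpenLLM publishes:

```python
# Minimal version checks against this advisory (0.6.10 affected, 0.6.11+ fixed).
def parse(ver: str) -> tuple:
    """Turn a dotted version string like '0.6.10' into a comparable tuple."""
    return tuple(int(p) for p in ver.split("."))

def is_vulnerable(ver: str) -> bool:
    # Per this advisory, only 0.6.10 is affected.
    return parse(ver) == (0, 6, 10)

def is_fixed(ver: str) -> bool:
    return parse(ver) >= (0, 6, 11)
```

Feed it the output of the version command above, e.g. is_vulnerable("0.6.10") is True and is_fixed("0.6.11") is True.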

📡 Detection & Monitoring

Log Indicators:

  • Unusual file path requests in web server logs
  • Patterns like '../../' or absolute path requests
  • Requests for sensitive files like /etc/passwd
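The log indicators above can be turned into a simple scanning sketch; the log format, file location, and exact patterns are assumptions to adapt to your deployment:

```python
# Flag web server log lines containing common LFI/traversal indicators.
import re

LFI_PATTERN = re.compile(
    r"(\.\./|\.\.%2f|%2e%2e|/etc/passwd|/etc/shadow)", re.IGNORECASE
)

def suspicious_lines(log_lines):
    """Yield (line_number, line) for entries matching LFI indicators."""
    for i, line in enumerate(log_lines, 1):
        if LFI_PATTERN.search(line):
            yield i, line

sample = [
    "GET /v1/models HTTP/1.1 200",
    "GET /static/..%2f..%2f..%2fetc/passwd HTTP/1.1 200",
]
# list(suspicious_lines(sample)) flags only the second entry
```

Note the URL-encoded variants (%2f, %2e%2e): attackers commonly encode traversal sequences to slip past naive substring filters.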

Network Indicators:

  • HTTP requests with file path traversal patterns
  • Unusual file downloads from the application

SIEM Query:

source="web_server_logs" AND (uri="*../*" OR uri="*/etc/passwd*" OR uri="*/etc/shadow*")
