CVE-2024-12766
📋 TL;DR
This SSRF vulnerability in parisneo/lollms-webui allows attackers to make the server send unauthorized HTTP requests to internal or external systems, potentially accessing sensitive resources. It affects version V13 (feather) installations where the vulnerable API endpoint is exposed. Attackers can exploit this by sending specially crafted JSON payloads to the proxy endpoint.
💻 Affected Systems
- parisneo/lollms-webui
📦 What is this software?
parisneo/lollms-webui (Lord of Large Language Models Web UI) is a web interface for running and chatting with local and remote large language models, exposing an HTTP API alongside its UI.
⚠️ Risk & Real-World Impact
Worst Case
Attackers could access internal services, exfiltrate sensitive data, or pivot to other systems using the server's network position and credentials.
Likely Case
Unauthorized access to internal web resources, potential credential theft from services accessible to the server, and data exfiltration.
If Mitigated
Limited impact if network segmentation restricts server access and proper authentication controls are in place.
🎯 Exploit Status
Exploitation requires network access to the API endpoint and knowledge of the vulnerable parameter; the advisory does not state that any authentication bypass is needed.
🛠️ Fix & Mitigation
✅ Official Fix
Patch Version: Check vendor advisory for specific version
Vendor Advisory: https://huntr.com/bounties/a143a2e2-1293-4dec-b875-3312584bd2b1
Restart Required: No
Instructions:
1. Update to the latest patched version of lollms-webui.
2. Verify the patch addresses the SSRF vulnerability in the /api/proxy endpoint.
3. Test the fix by attempting to exploit the vulnerability.
🔧 Temporary Workarounds
Disable vulnerable endpoint
Temporarily disable or restrict access to the /api/proxy endpoint
Configure web server or application firewall to block POST requests to /api/proxy
Input validation
Implement strict URL validation and whitelisting for the proxy endpoint
Modify application code to validate 'url' parameter against allowed domains only
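A minimal sketch of the validation described above, assuming the proxy handler receives the `url` value as a string. The allowlist contents and function name are hypothetical; adapt them to the domains your deployment actually needs.

```python
from urllib.parse import urlparse

# Hypothetical allowlist -- replace with the domains your proxy legitimately needs.
ALLOWED_HOSTS = {"api.example.com"}

def is_url_allowed(url: str) -> bool:
    """Reject anything that is not plain HTTP(S) to an allowlisted host."""
    parsed = urlparse(url)
    if parsed.scheme not in ("http", "https"):
        return False  # blocks file://, gopher://, etc.
    # parsed.hostname is lowercased by urlparse and None when no host is present
    return parsed.hostname in ALLOWED_HOSTS
```

Checking the parsed hostname (rather than substring-matching the raw URL) avoids bypasses like `http://api.example.com.evil.net/`.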
🧯 If You Can't Patch
- Implement network segmentation to restrict server outbound connections
- Deploy web application firewall with SSRF protection rules
🔍 How to Verify
Check if Vulnerable:
Test by sending POST request to /api/proxy with {"url":"http://internal-service"} and check if server makes the request
Check Version:
Check lollms-webui version in application interface or configuration files
Verify Fix Applied:
Attempt the same test after patching; the request should be rejected or blocked.
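The vulnerability check above can be sketched as follows. The `/api/proxy` path and `url` JSON field come from the advisory; `TARGET` and `CALLBACK` are placeholders for your own test instance and a listener you control. Only probe systems you are authorized to assess.

```python
import json
import urllib.request

TARGET = "http://127.0.0.1:9600"               # placeholder: your lollms-webui test instance
CALLBACK = "http://127.0.0.1:8000/ssrf-check"  # placeholder: a listener you control

def build_proxy_probe(target: str, callback: str) -> urllib.request.Request:
    """Build the POST /api/proxy request with {"url": <callback>} as its JSON body."""
    return urllib.request.Request(
        f"{target}/api/proxy",
        data=json.dumps({"url": callback}).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Send with: urllib.request.urlopen(build_proxy_probe(TARGET, CALLBACK), timeout=5)
# If your listener receives a hit, the server performed the attacker-supplied request.
```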
📡 Detection & Monitoring
Log Indicators:
- Unusual POST requests to /api/proxy endpoint
- Outbound HTTP requests from server to unexpected destinations
Network Indicators:
- Server making HTTP requests to internal IP addresses or unusual domains
- Increased outbound traffic from server
SIEM Query:
source_ip=server_ip AND (dest_port=80 OR dest_port=443) AND http_method=POST AND uri_path="/api/proxy"
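If no SIEM is available, the same log indicator can be checked directly against web server access logs. A minimal sketch, assuming combined-format log lines; the function name is illustrative.

```python
import re

# Flag POST requests to the vulnerable endpoint in a combined-format access log.
PROXY_POST = re.compile(r'"POST /api/proxy[^"]*"')

def find_proxy_posts(log_lines):
    """Return the log lines that record POSTs to /api/proxy."""
    return [line for line in log_lines if PROXY_POST.search(line)]
```

Run the matches against your expected traffic baseline; any unexplained hits warrant investigation of the destination the server subsequently contacted.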