CVE-2024-8063
📋 TL;DR
A divide-by-zero vulnerability in ollama/ollama v0.3.3 allows attackers to cause denial of service by importing malicious GGUF models with crafted block_count values. This affects anyone running vulnerable ollama servers that process untrusted model files. The server crashes when processing the malicious model, disrupting AI inference services.
💻 Affected Systems
- ollama/ollama
📦 What is this software?
Ollama is an open-source tool for running large language models (such as Llama, Mistral, and Gemma) locally. It loads models in the GGUF format and serves them over a local HTTP API for inference.
⚠️ Risk & Real-World Impact
Worst Case
Complete service outage of ollama server, disrupting all AI model inference capabilities and potentially affecting dependent applications.
Likely Case
Server crash requiring manual restart, causing temporary service disruption until recovery.
If Mitigated
No impact if models are validated before import or are accepted only from trusted sources.
🎯 Exploit Status
Exploitation requires crafting a malicious GGUF model file with specific block_count values and convincing the server to import it.
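The crash class is straightforward to illustrate. The sketch below is a hypothetical Python stand-in for the kind of per-layer size arithmetic a model loader performs when reading GGUF metadata; it is not ollama's actual (Go) code, and the function and values are illustrative only.

```python
def layer_size(total_tensor_bytes: int, block_count: int) -> int:
    """Hypothetical per-layer size calculation of the kind a model
    loader might perform. If block_count comes straight from
    attacker-controlled GGUF metadata, a value of 0 crashes the process."""
    return total_tensor_bytes // block_count  # raises ZeroDivisionError when block_count == 0


# A well-formed model divides cleanly...
print(layer_size(4096, 32))  # → 128

# ...but attacker-supplied metadata with block_count = 0 raises an
# exception, which in an unprepared server means a process crash.
try:
    layer_size(4096, 0)
except ZeroDivisionError:
    print("server would crash here")
```

In Go (ollama's implementation language), the equivalent integer division by zero triggers a run-time panic rather than an exception, which is why the observed impact is a full server crash.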
🛠️ Fix & Mitigation
✅ Official Fix
Patch Version: v0.3.4 or later
Vendor Advisory: https://github.com/ollama/ollama/security/advisories/GHSA-xxxx-xxxx-xxxx
Restart Required: No system reboot; the ollama service must be restarted to load the new binary.
Instructions:
1. Update ollama to v0.3.4 or later using your package manager.
2. For manual installs, download the latest release from GitHub.
3. Replace the existing binary and restart the ollama service so the new binary is loaded.
4. No system reboot is required.
🔧 Temporary Workarounds
Restrict model imports
Only import models from trusted, verified sources. Implement validation for GGUF files before processing.
Network segmentation
Isolate ollama servers from untrusted networks and restrict model upload capabilities.
🧯 If You Can't Patch
- Implement strict model validation: reject GGUF files with suspicious block_count values before processing.
- Monitor server logs for crash events and implement automated restart mechanisms for resilience.
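The first mitigation above can be sketched as a pre-import check. The helper below assumes GGUF metadata has already been parsed into a flat dict by some reader; the key naming convention (GGUF uses per-architecture keys such as llama.block_count) is real, but the function itself and the upper bound are illustrative assumptions, not part of ollama.

```python
def is_safe_block_count(metadata: dict) -> bool:
    """Reject GGUF metadata whose *.block_count entries are missing,
    non-integer, zero, or implausibly large. `metadata` is assumed to be
    a flat dict of GGUF key/value pairs produced by a GGUF reader; the
    threshold below is an illustrative sanity bound, not a spec value."""
    MAX_PLAUSIBLE_BLOCKS = 10_000  # hypothetical upper bound
    block_counts = [v for k, v in metadata.items() if k.endswith(".block_count")]
    if not block_counts:
        return False  # no block_count at all: refuse to import
    return all(
        isinstance(v, int) and 0 < v <= MAX_PLAUSIBLE_BLOCKS
        for v in block_counts
    )


print(is_safe_block_count({"llama.block_count": 32}))  # → True
print(is_safe_block_count({"llama.block_count": 0}))   # → False (the CVE trigger)
```

Running such a check in a separate, sandboxed process before handing the file to the server keeps a malformed file from taking down the inference service itself.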
🔍 How to Verify
Check if Vulnerable:
Run ollama --version. If the output shows v0.3.3, the system is vulnerable.
Check Version:
ollama --version
Verify Fix Applied:
After updating, run ollama --version and confirm the reported version is v0.3.4 or later.
📡 Detection & Monitoring
Log Indicators:
- Server crash logs with divide-by-zero errors
- Unexpected termination of ollama process
- Error messages related to GGUF model parsing
Network Indicators:
- Sudden drop in ollama service availability
- Failed model import requests
SIEM Query:
process:ollama AND (event:crash OR error:"divide by zero" OR error:GGUF)
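As a lightweight complement to a SIEM query, the log indicators above can be scanned for directly. The sketch below matches a log stream against the crash signatures listed in this advisory; the exact wording of ollama's log lines is an assumption, so tune the patterns to your deployment.

```python
import re

# Crash signatures from the "Log Indicators" list above. The exact
# wording of ollama's logs is an assumption; adjust patterns as needed.
INDICATORS = [
    re.compile(r"divide by zero", re.IGNORECASE),  # Go panics read "integer divide by zero"
    re.compile(r"error.*gguf", re.IGNORECASE),     # GGUF model-parsing errors
]


def suspicious_lines(log_lines):
    """Yield (line_number, line) for log lines matching any indicator."""
    for n, line in enumerate(log_lines, start=1):
        if any(p.search(line) for p in INDICATORS):
            yield n, line


# Illustrative sample log, not captured from a real ollama server.
sample = [
    'level=INFO msg="server listening"',
    "panic: runtime error: integer divide by zero",
    "error: failed to parse GGUF metadata",
]
for n, line in suspicious_lines(sample):
    print(f"line {n}: {line}")
```

Pairing a scan like this with an automated restart (e.g. a supervisor that relaunches the process on failure) covers the resilience measure suggested under "If You Can't Patch".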