CVE-2025-3264

5.3 MEDIUM

📋 TL;DR

A Regular Expression Denial of Service (ReDoS) vulnerability in the Hugging Face Transformers library allows attackers to cause excessive CPU consumption by supplying specially crafted input strings to the `get_imports()` function. This affects users running Transformers version 4.49.0 for model serving, dynamic code loading, or development pipelines, and can lead to service disruption and resource exhaustion.
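ReDoS attacks exploit catastrophic backtracking in a regex engine. The snippet below is a generic illustration of the mechanism, not the actual pattern used by `get_imports()`: a nested quantifier forces the engine to try exponentially many ways of partitioning the input before a failed match is reported.

```python
import re
import time

# Illustrative ReDoS pattern (NOT the regex from transformers):
# the nested quantifier (a+)+ lets the engine partition the run of
# 'a's in exponentially many ways before the trailing '!' forces
# the overall match to fail.
EVIL = re.compile(r"^(a+)+$")

def match_time(n: int) -> float:
    s = "a" * n + "!"  # the '!' guarantees the match ultimately fails
    start = time.perf_counter()
    EVIL.match(s)
    return time.perf_counter() - start

# Each extra 'a' roughly doubles the backtracking work, which is why
# a short crafted string can pin a CPU core.
print(match_time(20) > match_time(10))
```

The same principle applies regardless of the specific vulnerable pattern: attacker-controlled input reaching a backtracking regex with ambiguous repetition is enough.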

💻 Affected Systems

Products:
  • Hugging Face Transformers
Versions: Version 4.49.0 only
Operating Systems: All operating systems running Python
Default Config Vulnerable: ⚠️ Yes
Notes: Only affects the specific version 4.49.0; earlier and later versions are not vulnerable. Requires the `get_imports()` function to be called with malicious input.

📦 What is this software?

Hugging Face Transformers is a widely used open-source Python library providing pretrained machine learning models and tooling for natural language processing, computer vision, and audio tasks. The vulnerable `get_imports()` helper parses the source of dynamically loaded model code to discover its dependencies.

⚠️ Risk & Real-World Impact

🔴

Worst Case

Complete denial of service in model serving environments, disrupting AI inference services and potentially causing cascading failures in dependent systems.

🟠

Likely Case

Temporary service degradation or disruption during code loading operations, particularly affecting development pipelines and model deployment workflows.

🟢

If Mitigated

Minimal impact with proper input validation, rate limiting, and updated library versions.

🌐 Internet-Facing: MEDIUM - Exploitable via API endpoints that process user-controlled Python code or module imports, but requires specific input patterns.
🏢 Internal Only: MEDIUM - Development environments and internal model serving could be affected by malicious code or supply chain attacks.

🎯 Exploit Status

Public PoC: ⚠️ Yes
Weaponized: LIKELY
Unauthenticated Exploit: ✅ No
Complexity: LOW

Proof of concept available in public bounty reports. Exploitation requires ability to provide input to the vulnerable function, which typically requires some level of access or API interaction.

🛠️ Fix & Mitigation

✅ Official Fix

Patch Version: 4.51.0

Vendor Advisory: https://github.com/huggingface/transformers/commit/0720e206c6ba28887e4d60ef60a6a089f6c1cc76

Restart Required: No

Instructions:

1. Update Hugging Face Transformers: `pip install "transformers>=4.51.0"` (quote the constraint so the shell does not interpret `>` as a redirect)
2. Verify the update: `pip show transformers`
3. Test that your application functions correctly with the new version.

🔧 Temporary Workarounds

Input Validation and Sanitization

Applies to: all platforms

Implement strict input validation for any user-provided code or strings that might reach the `get_imports()` function.
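One hedged sketch of such a guard (the size limit, line-length threshold, and helper name are illustrative assumptions, not a transformers API): reject oversized or suspicious module source before it reaches any dynamic-module parsing.

```python
# Illustrative pre-filter for user-supplied module source; the 64 KiB
# cap and the function name are assumptions, not part of transformers.
MAX_SOURCE_BYTES = 64 * 1024
MAX_LINE_CHARS = 10_000

def validate_module_source(source: str) -> str:
    if len(source.encode("utf-8")) > MAX_SOURCE_BYTES:
        raise ValueError("module source exceeds size limit")
    # Extremely long single lines are a common ReDoS amplifier, since
    # backtracking cost grows with the length of the matched run.
    if any(len(line) > MAX_LINE_CHARS for line in source.splitlines()):
        raise ValueError("suspiciously long line in module source")
    return source
```

A filter like this does not remove the underlying flaw; it only bounds the work an attacker can force per request.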

Version Downgrade

Applies to: all platforms

Temporarily downgrade to version 4.48.0 or earlier until patching is possible:

pip install transformers==4.48.0

🧯 If You Can't Patch

  • Implement rate limiting and timeout mechanisms for code processing operations
  • Isolate Transformers processing to dedicated containers with CPU limits and monitoring
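A minimal sketch of the timeout idea above (the helper name and the five-second budget are illustrative): run the potentially CPU-bound call in a worker process so a runaway regex can be terminated instead of stalling the serving thread.

```python
import multiprocessing as mp

def run_with_timeout(fn, args=(), timeout_s=5.0):
    # Execute a potentially CPU-bound call in a separate worker process;
    # a hung regex then exhausts the child's time budget, not the
    # serving thread's. A thread-based timeout would NOT work here,
    # because a thread stuck in a C-level regex match cannot be killed.
    with mp.Pool(processes=1) as pool:
        result = pool.apply_async(fn, args)
        # Raises multiprocessing.TimeoutError when the budget is exceeded.
        return result.get(timeout=timeout_s)

if __name__ == "__main__":
    print(run_with_timeout(len, ("hello",)))  # fast call completes normally
```

Combined with container-level CPU limits, this caps the blast radius of a crafted input to one worker rather than the whole service.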

🔍 How to Verify

Check if Vulnerable:

Check the Transformers version with `python -c "import transformers; print(transformers.__version__)"` and confirm whether the output equals `4.49.0`.

Check Version:

python -c "import transformers; print(transformers.__version__)"
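The same check can be scripted. This sketch uses only the standard library and treats an absent installation as not vulnerable (the helper name is illustrative):

```python
from importlib.metadata import PackageNotFoundError, version

def transformers_is_vulnerable() -> bool:
    # True only when the installed transformers is exactly the affected
    # 4.49.0 release; False when it is absent or any other version.
    try:
        return version("transformers") == "4.49.0"
    except PackageNotFoundError:
        return False

print(transformers_is_vulnerable())
```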

Verify Fix Applied:

After updating, confirm the version is 4.51.0 or higher using the same command.

📡 Detection & Monitoring

Log Indicators:

  • Unusually high CPU usage in Transformers processes
  • Timeout errors in code loading operations
  • Repeated failed import attempts

Network Indicators:

  • Increased latency in model serving endpoints
  • Timeout responses from AI inference services

SIEM Query:

process.name:"python" AND process.args:"transformers" AND cpu.usage > 90%
