CVE-2025-61920

CVSS: 7.5 (HIGH)

📋 TL;DR

This vulnerability in Authlib allows remote attackers to craft malicious JWT tokens with extremely large header or signature segments, causing excessive CPU and memory consumption during parsing. This enables denial-of-service attacks against applications using vulnerable versions of Authlib. All systems using Authlib versions before 1.6.5 are affected.
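As a rough illustration of the attack shape (a sketch, not a working exploit), a structurally valid compact JWT with an inflated header segment is all an attacker needs to submit:

```python
import base64
import json

# Sketch: a JWT-like string whose header segment is padded to ~10 MB.
# Parsers that base64-decode and JSON-parse segments before enforcing
# size limits spend CPU and memory here before any signature check.
header = {"alg": "HS256", "typ": "JWT", "pad": "A" * 10_000_000}
header_b64 = base64.urlsafe_b64encode(json.dumps(header).encode()).rstrip(b"=")
payload_b64 = base64.urlsafe_b64encode(b'{"sub":"x"}').rstrip(b"=")
token = b".".join([header_b64, payload_b64, b"sig"])
print(len(token) > 10_000_000)  # a single header smuggles >10 MB into the parser
```

Because the string still has the three-segment JWS shape, naive format checks pass it through; only an explicit size budget stops it.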

💻 Affected Systems

Products:
  • Authlib
Versions: All versions prior to 1.6.5
Operating Systems: All platforms running Python
Default Config Vulnerable: ⚠️ Yes
Notes: Any Python application using Authlib for OAuth/OpenID Connect token validation is vulnerable by default.

📦 What is this software?

Authlib is a widely used Python library implementing OAuth 1.0/2.0 clients and servers, OpenID Connect, and the JOSE stack (JWS, JWE, JWK, JWT). It is commonly integrated into Flask, Django, FastAPI, and Starlette applications to issue and validate tokens on authentication endpoints.

⚠️ Risk & Real-World Impact

🔴

Worst Case

Complete service outage due to resource exhaustion, potentially affecting multiple services if authentication infrastructure is shared.

🟠

Likely Case

Service degradation or temporary unavailability for targeted applications, requiring restart of affected services.

🟢

If Mitigated

Minimal impact with proper input validation and rate limiting in place.

🌐 Internet-Facing: HIGH
🏢 Internal Only: MEDIUM

🎯 Exploit Status

Public PoC: ❌ No
Weaponized: LIKELY
Unauthenticated Exploit: ⚠️ Yes
Complexity: LOW

Exploitation requires only the ability to submit crafted tokens to vulnerable endpoints.

🛠️ Fix & Mitigation

✅ Official Fix

Patch Version: 1.6.5

Vendor Advisory: https://github.com/authlib/authlib/security/advisories/GHSA-pq5p-34cr-23v9

Restart Required: Yes

Instructions:

1. Update Authlib: pip install --upgrade "authlib>=1.6.5"
2. Restart all services using Authlib
3. Verify the update with: pip show authlib

🔧 Temporary Workarounds

Input size limiting (applies to all platforms)

Implement request size limits before tokens reach Authlib parsing:

# Configure web server limits (nginx example):
client_max_body_size 1M;
# Configure application server limits (gunicorn example):
--limit-request-line 8190
--limit-request-fields 100
--limit-request-field_size 8190
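Beyond server-level limits, the same budget can be enforced in-process. A minimal WSGI middleware sketch (hypothetical; not part of Authlib or any framework) that rejects oversized Authorization headers before any token parsing runs:

```python
# Hypothetical middleware: cap the Authorization header size before the
# application (and therefore Authlib) ever sees the token.
MAX_AUTH_HEADER = 8192  # bytes; tune to your real token sizes


class AuthHeaderLimitMiddleware:
    def __init__(self, app, limit=MAX_AUTH_HEADER):
        self.app = app
        self.limit = limit

    def __call__(self, environ, start_response):
        auth = environ.get("HTTP_AUTHORIZATION", "")
        if len(auth) > self.limit:
            start_response("431 Request Header Fields Too Large",
                           [("Content-Type", "text/plain")])
            return [b"Authorization header too large"]
        return self.app(environ, start_response)
```

Wrapping the WSGI app (e.g. `app.wsgi_app = AuthHeaderLimitMiddleware(app.wsgi_app)`) makes the check run on every request, independent of the framework's routing.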

Rate limiting (applies to all platforms)

Implement application-level throttling to reduce amplification risk:

# Example using Flask-Limiter (3.x API; the key function is required):
from flask_limiter import Limiter
from flask_limiter.util import get_remote_address
limiter = Limiter(get_remote_address, app=app, default_limits=["100 per minute"])

🧯 If You Can't Patch

  • Implement strict input validation to reject tokens exceeding reasonable size limits (e.g., >10KB)
  • Deploy WAF rules to block requests with abnormally large Authorization headers or token payloads
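The first bullet can be sketched as a pre-parse guard (names and limits are illustrative, not an Authlib API):

```python
# Illustrative pre-parse check: reject tokens whose total length or
# individual segments exceed a budget before any JWT library sees them.
MAX_TOKEN_BYTES = 10 * 1024    # ~10 KB total, per the guidance above
MAX_SEGMENT_BYTES = 4 * 1024   # per header/payload/signature segment


def token_within_limits(token: str) -> bool:
    if len(token) > MAX_TOKEN_BYTES:
        return False
    parts = token.split(".")
    if len(parts) != 3:  # a compact JWS has exactly three segments
        return False
    return all(len(p) <= MAX_SEGMENT_BYTES for p in parts)
```

Calling this on the raw token string and returning 401 on failure keeps oversized input away from the vulnerable parsing path.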

🔍 How to Verify

Check if Vulnerable:

pip show authlib | grep Version

Any reported version below 1.6.5 is vulnerable.

Verify Fix Applied:

Verify the version is 1.6.5 or higher, then test with oversized sample tokens to confirm they are rejected quickly rather than consuming CPU and memory.

📡 Detection & Monitoring

Log Indicators:

  • Unusually large request sizes to authentication endpoints
  • High memory/CPU usage spikes during token validation
  • Failed authentication attempts with malformed tokens

Network Indicators:

  • Large Authorization headers (>1MB)
  • Multiple authentication requests from single source in short time

SIEM Query:

source="auth_logs" AND (request_size>1048576 OR token_length>10000) OR (process="python" AND (memory_usage>90% OR cpu_usage>90%))
