CVE-2021-37404

9.8 CRITICAL

📋 TL;DR

CVE-2021-37404 is a critical heap buffer overflow vulnerability in Apache Hadoop's libhdfs native code. Opening a user-provided file path without validation can result in denial of service or arbitrary code execution. Any Hadoop deployment that uses the native HDFS client library (libhdfs) is affected.

💻 Affected Systems

Products:
  • Apache Hadoop
Versions: Apache Hadoop 2.x before 2.10.2, 3.2.x before 3.2.3, and 3.3.x before 3.3.2
Operating Systems: All operating systems running Hadoop
Default Config Vulnerable: ⚠️ Yes
Notes: Only affects deployments using the native libhdfs library. Pure Java implementations are not vulnerable.

📦 What is this software?

Apache Hadoop is an open-source framework for distributed storage and processing of large data sets across clusters of commodity hardware. HDFS, the Hadoop Distributed File System, is its storage layer; libhdfs is a C library that lets native (non-Java) applications read and write HDFS files, and it is this native code that contains the overflow.

⚠️ Risk & Real-World Impact

🔴

Worst Case

Remote code execution with the privileges of the Hadoop service account, leading to complete system compromise, data exfiltration, and lateral movement within the Hadoop cluster.

🟠

Likely Case

Denial of service causing Hadoop services to crash, disrupting data processing pipelines and cluster operations.

🟢

If Mitigated

Limited impact if proper network segmentation and least privilege access controls prevent exploitation attempts.

🌐 Internet-Facing: HIGH - If Hadoop services are exposed to the internet, attackers can directly exploit this vulnerability without authentication.
🏢 Internal Only: MEDIUM - Internal attackers or compromised systems within the network could exploit this, but requires access to Hadoop services.

🎯 Exploit Status

Public PoC: ✅ No
Weaponized: UNKNOWN
Unauthenticated Exploit: ⚠️ Yes
Complexity: LOW

Exploitation requires that a user-supplied file path reach libhdfs without validation; such paths can be supplied through various Hadoop interfaces and APIs.

🛠️ Fix & Mitigation

✅ Official Fix

Patch Version: Apache Hadoop 2.10.2, 3.2.3, or 3.3.2 (or later on the corresponding release line)

Vendor Advisory: https://lists.apache.org/thread/2h56ztcj3ojc66qzf1nno88vjw9vd4wo

Restart Required: Yes

Instructions:

1. Download the patched version from the Apache Hadoop website.
2. Back up the current configuration and data.
3. Stop all Hadoop services.
4. Install the patched version.
5. Restore the configuration.
6. Restart Hadoop services.
7. Verify functionality.

🔧 Temporary Workarounds

Disable native libhdfs (applies to: all platforms)

Configure Hadoop to use the pure-Java HDFS client instead of native libhdfs: set hadoop.native.lib to false in core-site.xml.
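Rendered as a core-site.xml fragment, the workaround looks like the sketch below. Note that hadoop.native.lib is deprecated in recent Hadoop releases in favor of io.native.lib.available, so check the configuration reference for your version before applying this:

```xml
<configuration>
  <property>
    <name>hadoop.native.lib</name>
    <value>false</value>
    <!-- Falls back to the pure-Java implementation where one exists -->
  </property>
</configuration>
```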

Restrict file path input (applies to: all platforms)

Implement input validation and sanitization for all user-provided file paths.
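As a sketch of what such validation might look like, the function below rejects obviously hostile paths before they reach libhdfs. The length cap and the specific rules are illustrative assumptions, not checks defined by Hadoop:

```python
# Illustrative allow-list style validator for user-supplied HDFS paths.
# MAX_PATH_LEN is an assumed conservative cap, not a Hadoop constant.
MAX_PATH_LEN = 1024

def is_safe_hdfs_path(path: str) -> bool:
    """Reject paths that are empty, overlong, contain NUL bytes,
    or attempt parent-directory traversal."""
    if not path or len(path) > MAX_PATH_LEN:
        return False
    if "\x00" in path:
        return False
    # Reject any ".." component rather than trying to normalize it away
    if ".." in path.split("/"):
        return False
    return True
```

In practice this check would sit in whatever gateway or application layer accepts paths from end users, so that only vetted strings are ever passed to the native client.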

🧯 If You Can't Patch

  • Implement strict network segmentation to isolate Hadoop clusters from untrusted networks
  • Apply principle of least privilege and restrict user access to Hadoop services

🔍 How to Verify

Check if Vulnerable:

Run the 'hadoop version' command and compare the reported version against the affected ranges

Check Version:

hadoop version

Verify Fix Applied:

Verify version is 2.10.2+, 3.2.3+, or 3.3.2+ and test file operations with various path inputs
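The version comparison above can be scripted; this minimal sketch assumes plain "X.Y.Z" version strings (pre-release suffixes like "-SNAPSHOT" are not handled):

```python
# First fixed release on each branch named in the advisory.
FIXED = [(2, 10, 2), (3, 2, 3), (3, 3, 2)]

def parse(version: str) -> tuple:
    """Parse 'X.Y.Z' into an integer tuple for comparison."""
    return tuple(int(p) for p in version.split(".")[:3])

def is_patched(version: str) -> bool:
    v = parse(version)
    # Compare against the fix on the matching major.minor branch, if any
    for fix in FIXED:
        if v[:2] == fix[:2]:
            return v >= fix
    # Conservative fallback for other branches: only releases at or
    # beyond the newest listed fix are treated as safe.
    return v >= (3, 3, 2)
```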

📡 Detection & Monitoring

Log Indicators:

  • Unusual file path patterns in Hadoop logs
  • Hadoop service crashes or restarts
  • Stack traces containing libhdfs or buffer overflow references

Network Indicators:

  • Unusual connections to Hadoop services from unexpected sources
  • Multiple failed file operations with malformed paths

SIEM Query:

source="hadoop.log" AND ("libhdfs" OR "buffer overflow" OR "segmentation fault")
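Outside a SIEM, the same indicators can be scanned from raw log lines. This Python sketch mirrors the query above; the indicator list and line-based log format are assumptions:

```python
import re

# Case-insensitive pattern mirroring the SIEM query's indicators.
INDICATORS = re.compile(r"libhdfs|buffer overflow|segmentation fault",
                        re.IGNORECASE)

def suspicious_lines(log_lines):
    """Return (line_number, line) pairs matching any indicator."""
    return [(i, line) for i, line in enumerate(log_lines, 1)
            if INDICATORS.search(line)]
```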
