Add configurable limit for the maximum number of bytes/chars of content to parse before failing #1046

Closed

Description

@cowtowncoder

(note: part of #637)

Jackson 2.15 introduced a few processing limits that can be applied to guard against "too big" content, focusing first on overall nesting depth and the maximum length of individual tokens.
While this is a good first step, it also makes sense to offer a simple way to limit the total amount of content allowed to be read -- typically a maximum document size or, for line-delimited input, a maximum amount of streamed content.
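For reference, the existing 2.15 limits are configured via StreamReadConstraints; a minimal sketch follows (the limit values shown are illustrative, not the library defaults):

```java
import com.fasterxml.jackson.core.JsonFactory;
import com.fasterxml.jackson.core.StreamReadConstraints;

public class ExistingLimitsExample {
    public static void main(String[] args) {
        // 2.15 limits: nesting depth and per-token length; values here
        // are illustrative, not the library defaults
        JsonFactory factory = JsonFactory.builder()
                .streamReadConstraints(StreamReadConstraints.builder()
                        .maxNestingDepth(500)        // max depth of nested Arrays/Objects
                        .maxStringLength(1_000_000)  // max length of an individual String token
                        .build())
                .build();
        System.out.println("Configured factory: " + factory);
    }
}
```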

The reasoning for adding such a feature is that although users can -- if they must -- implement this at a lower level (with a length-limited InputStream, for example; see the sketch after the list below), there are some benefits to the Jackson streaming component offering it:

  1. Less work for the user (obviously) and better accessibility, leading to wider adoption and helping guard against possible DoS vectors
  2. Better integration via a well-defined exception type common to all constraint violations (StreamConstraintsException)
  3. More reliable limits when a shared implementation is used (i.e., it is less likely that users/devs implement faulty limit checks)
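To illustrate points 1 and 2: the lower-level workaround mentioned above might look roughly like the following hand-rolled wrapper. LengthLimitedInputStream is a hypothetical class, not part of Jackson; note that it has to make do with a plain IOException rather than a well-defined constraints exception type:

```java
import java.io.FilterInputStream;
import java.io.IOException;
import java.io.InputStream;

// Hypothetical user-written wrapper (not a Jackson class) that fails
// once more than a given number of bytes has been read
public class LengthLimitedInputStream extends FilterInputStream {
    private final long maxBytes;
    private long bytesRead;

    public LengthLimitedInputStream(InputStream in, long maxBytes) {
        super(in);
        this.maxBytes = maxBytes;
    }

    @Override
    public int read() throws IOException {
        int b = super.read();
        if (b >= 0) {
            count(1);
        }
        return b;
    }

    @Override
    public int read(byte[] buf, int off, int len) throws IOException {
        int n = super.read(buf, off, len);
        if (n > 0) {
            count(n);
        }
        return n;
    }

    private void count(long n) throws IOException {
        bytesRead += n;
        if (bytesRead > maxBytes) {
            // No common, well-defined exception type available (point 2 above)
            throw new IOException("Input exceeds maximum allowed length of "
                    + maxBytes + " bytes");
        }
    }
}
```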

Note, too, that this feature significantly improves the usefulness (currently rather limited) of #863, by making it possible to combine per-token limits with overall limits.

NOTE: the default setting for this limit should, however, be left as "unlimited": using anything else is likely to break some processing somewhere.
The limit has to be defined as a 64-bit long (not an int); the default value to use is then likely Long.MAX_VALUE.
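Putting it together, usage could look something like the sketch below. The maxDocumentLength(long) builder method is the hypothetical new setting proposed here (the eventual name and shape of the API may differ); maxStringLength() and StreamConstraintsException already exist as of 2.15:

```java
import java.io.File;

import com.fasterxml.jackson.core.JsonFactory;
import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.core.StreamReadConstraints;
import com.fasterxml.jackson.core.exc.StreamConstraintsException;

public class DocumentLengthLimitExample {
    public static void main(String[] args) throws Exception {
        JsonFactory factory = JsonFactory.builder()
                .streamReadConstraints(StreamReadConstraints.builder()
                        // existing 2.15 per-token limit
                        .maxStringLength(1_000_000)
                        // hypothetical total-content limit proposed in this issue:
                        // a 64-bit long, defaulting to Long.MAX_VALUE ("unlimited")
                        .maxDocumentLength(20_000_000L)
                        .build())
                .build();
        try (JsonParser p = factory.createParser(new File("input.json"))) {
            while (p.nextToken() != null) {
                // stream through all content; limits are enforced during parsing
            }
        } catch (StreamConstraintsException e) {
            // single well-defined exception type for all constraint violations
            System.err.println("Constraint violated: " + e.getMessage());
        }
    }
}
```

With the default left at Long.MAX_VALUE, code that never configures the new setting would see no behavior change.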

Metadata

Labels

2.16: Issue planned (at earliest) for 2.16
processing-limits: Issues related to limiting aspects of input/output that can be processed without exception
