Add configurable limit for the maximum number of bytes/chars of content to parse before failing #1046
Labels
- `2.16`: Issue planned (at earliest) for 2.16
- `processing-limits`: Issues related to limiting aspects of input/output that can be processed without exception
(note: part of #637)
Jackson 2.15 introduced a first set of processing limits that can be applied to fail fast on "too big content", focusing on overall nesting depth and the maximum length of individual tokens.
While this is a good first step, it also makes sense to offer a simple way to limit the total amount of content allowed to be read -- typically a maximum document size or, in the case of line-delimited input, a maximum amount of streamed content.
The reasoning for adding such a feature is that although users can -- if they must -- implement this at a lower level (a length-limited `InputStream`, for example), there are benefits to the Jackson streaming component offering it directly (such as failing with the standard `StreamConstraintsException`).
Note, too, that this feature significantly improves the usefulness (or, right now, lack thereof) of #863, by allowing per-token limits to be combined with overall limits.
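To illustrate the lower-level workaround mentioned above, here is a minimal sketch of a length-limited `InputStream` wrapper. The class and its names (`LimitedInputStream`, `maxBytes`) are illustrative, not Jackson API; a real implementation would also need to handle `skip()` and mark/reset.

```java
import java.io.FilterInputStream;
import java.io.IOException;
import java.io.InputStream;

// Illustrative sketch (not Jackson API): an InputStream wrapper that
// fails once more than maxBytes have been read from the underlying stream.
class LimitedInputStream extends FilterInputStream {
    private final long maxBytes;
    private long readCount;

    LimitedInputStream(InputStream in, long maxBytes) {
        super(in);
        this.maxBytes = maxBytes;
    }

    @Override
    public int read() throws IOException {
        int b = super.read();
        if (b != -1 && ++readCount > maxBytes) {
            throw new IOException("Content exceeds limit of " + maxBytes + " bytes");
        }
        return b;
    }

    @Override
    public int read(byte[] buf, int off, int len) throws IOException {
        int n = super.read(buf, off, len);
        if (n > 0) {
            readCount += n;
            if (readCount > maxBytes) {
                throw new IOException("Content exceeds limit of " + maxBytes + " bytes");
            }
        }
        return n;
    }
}
```

The drawback this issue points out: a plain `IOException` from such a wrapper is indistinguishable from a real I/O failure, whereas a Jackson-level limit can signal the specific `StreamConstraintsException`.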
NOTE: the default setting for this limit should, however, be left as "unlimited": using anything else is likely to break some processing somewhere.
The limit has to be defined as a 64-bit `long` (not `int`); the default value to use is then likely `Long.MAX_VALUE`.
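The reason a 32-bit `int` does not suffice can be shown with a quick arithmetic check: realistic streaming content (say, 3 GB) already exceeds `Integer.MAX_VALUE`, and casting such a size to `int` silently wraps to a negative value.

```java
// Why the limit must be a 64-bit long: content sizes above ~2 GiB
// do not fit in a signed 32-bit int and wrap around on narrowing.
public class LimitWidth {
    public static void main(String[] args) {
        long threeGb = 3_000_000_000L;   // valid as a long
        int truncated = (int) threeGb;   // narrowing conversion wraps

        System.out.println(threeGb > Integer.MAX_VALUE); // limit exceeds int range
        System.out.println(truncated < 0);               // wrapped to a negative value
    }
}
```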