Description
- Version: v12.13.0
- Platform: Windows 10 (64-bit)
- Subsystem: string_decoder, stream
What steps will reproduce the bug?
const buf = Buffer.alloc(1024 * 1024 * 1024, 'a');
const sd = new (require('string_decoder').StringDecoder)();
sd.write(buf).length
How often does it reproduce? Is there a required condition?
The above causes the issue 100% reliably for me.
What is the expected behavior?
Either the decoded string, or an exception explaining that the input buffer is too large to decode.
What do you see instead?
TypeError: Cannot read property 'length' of undefined
So sd.write() returns undefined for the large buffer, and the .length access on that result is what throws.
The reason I'm including stream as an affected subsystem is this line, which originally led me to the issue: https://github.com/nodejs/readable-stream/blob/040b813e60ecf0d68ac1461a3fc3157ea5785950/lib/_stream_readable.js#L277
So what happens is that if you have a stream (a pipe used for IPC in my case) and the other side sends a large chunk of data, the receiving side throws this exception from inside the stream internals, before any application code that could handle it gracefully is ever invoked.
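To illustrate the stream side, here is a minimal sketch of how I believe that internal path gets hit (this is my reading of the linked line, not something I have traced through the internals): with an encoding set, the stream runs every chunk through a StringDecoder, its write() returns undefined for an oversized chunk, and the internal chunk.length check then throws before any 'data' or 'error' handler runs.

const { Readable } = require('stream');
const r = new Readable({ read() {} });
r.setEncoding('utf8'); // chunks now go through StringDecoder internally
r.on('data', () => {});
r.on('error', (err) => console.error('I would expect this handler to fire:', err));
// On affected versions this appears to throw synchronously from the stream internals
// ("TypeError: Cannot read property 'length' of undefined") instead of emitting 'error'.
r.push(Buffer.alloc(1024 * 1024 * 1024, 'a'));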
Additional information
I don't know exactly how large the buffer has to be. In my application (Electron 8.5.2 with Node 12.13.0) this started happening at around 600 MB; when reproducing in Node on the command line I had to increase the buffer to 1 GB.
Just to clarify: I understand that there will be a limit on how large these buffers can get; I just wish it reported an error that my application code can handle, because right now I don't see how I could write robust client code that can deal with the other side sending crap.
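The only workaround I can think of is to keep the stream in binary mode and guard the decode step myself. A rough sketch, assuming the hard limit is V8's maximum string length (which Node exposes as buffer.constants.MAX_STRING_LENGTH) and using a hypothetical safeDecode() helper of my own:

const { StringDecoder } = require('string_decoder');
const { constants } = require('buffer');

const decoder = new StringDecoder('utf8');

// Hypothetical helper: refuse to decode a chunk that might not fit in a string.
// Conservative check: a UTF-8 buffer decodes to at most chunk.length UTF-16 code units,
// so anything at or below MAX_STRING_LENGTH bytes is safe; larger chunks are rejected
// with a catchable error instead of blowing up inside the stream internals.
function safeDecode(chunk) {
  if (chunk.length > constants.MAX_STRING_LENGTH) {
    throw new RangeError(`chunk of ${chunk.length} bytes may exceed the maximum string length`);
  }
  return decoder.write(chunk);
}

That still means avoiding setEncoding() on the stream entirely, which is why I'd prefer the decoder (or the stream) to surface a proper error itself.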