Closed
Description
Using a filesystem read stream to read a large file (~6 GB of random contents) and writing the read contents to the console with console.log() or process.stdout.write(), node hangs for about a minute and then dumps core.
No issues are noted when piping the read stream to an fs writable stream; the file is written successfully (see the pipe sketch below).
Incidentally, running "cat" on the same large file completes successfully.
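For reference, a minimal sketch of the pipe-based variant mentioned above, which completes without running out of memory because pipe() handles backpressure (the destination filename "copy.txt" is only a placeholder):

var fs = require("fs");
// Pipe the large file into an fs writable stream instead of stdout.
fs.createReadStream("randfile.txt")
  .pipe(fs.createWriteStream("copy.txt"));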
Node script:
var fs = require("fs");
var fstream = fs.createReadStream("randfile.txt");
fstream.on("readable", () => {
  // On every 'readable' event, pull whatever is buffered and write it straight
  // to stdout; this is the pattern that triggers the OOM abort shown below.
  process.stdout.write(fstream.read());
});
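For comparison, here is a sketch of a workaround that checks the return value of process.stdout.write() and pauses the read stream until 'drain', so chunks are not queued without bound when stdout is slower than the file read. This is only an illustration of backpressure handling, not a statement about the root cause:

var fs = require("fs");
var fstream = fs.createReadStream("randfile.txt");
fstream.on("data", (chunk) => {
  // write() returns false once its internal buffer is full;
  // pause the read side and resume after stdout drains.
  if (!process.stdout.write(chunk)) {
    fstream.pause();
    process.stdout.once("drain", () => fstream.resume());
  }
});

Equivalently, fstream.pipe(process.stdout) lets the stream machinery manage the backpressure itself.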
<--- Last few GCs --->
3022 ms: Mark-sweep 1334.3 (1458.1) -> 1334.3 (1458.1) MB, 21.2 / 0 ms [last resort gc].
3024 ms: Scavenge 1334.4 (1458.1) -> 1334.4 (1458.1) MB, 0.8 / 0 ms [allocation failure].
3024 ms: Scavenge 1334.4 (1458.1) -> 1334.4 (1458.1) MB, 0.5 / 0 ms [allocation failure].
3045 ms: Mark-sweep 1334.4 (1458.1) -> 1334.4 (1458.1) MB, 20.8 / 0 ms [last resort gc].
3066 ms: Mark-sweep 1334.4 (1458.1) -> 1334.4 (1458.1) MB, 21.2 / 0 ms [last resort gc].
<--- JS stacktrace --->
==== JS stack trace =========================================
Security context: 0x2cd1a77e3ac1 <JS Object>
1: toString [buffer.js:400] [pc=0x323ae154b0f6] (this=0x12e5a99444f9 <an Uint8Array with map 0x21206d5054f1>)
2: /* anonymous */ [/home/arun/workspace/nodews/learning/stream/longfileread.js:~5] [pc=0x323ae159132f] (this=0x3a75328eb1c1 <a ReadStream with map 0x21206d518d69>)
3: emit [events.js:~130] [pc=0x323ae159a476] (this=0x3a75328eb1c1 <a ReadStream with map 0x21206d518d69>,type=...
FATAL ERROR: CALL_AND_RETRY_LAST Allocation failed - process out of memory
Aborted (core dumped)