Commit 2b4c496

move stdin.read example to streams.html readable.read() and link to it from process.html stdin
1 parent 990e4a6 commit 2b4c496

2 files changed: +38 -48 lines changed

doc/api/process.md

Lines changed: 3 additions & 44 deletions
@@ -2254,6 +2254,8 @@ The `process.stdin` property returns a stream connected to
 stream) unless fd `0` refers to a file, in which case it is
 a [Readable][] stream.
 
+For details of how to read from `stdin` see [`readable.read()`][].
+
 As a [Duplex][] stream, `process.stdin` can also be used in "old" mode that
 is compatible with scripts written for Node.js prior to v0.10.
 For more information see [Stream compatibility][].
@@ -2262,50 +2264,6 @@ In "old" streams mode the `stdin` stream is paused by default, so one
 must call `process.stdin.resume()` to read from it. Note also that calling
 `process.stdin.resume()` itself would switch stream to "old" mode.
 
-```js
-process.stdin.setEncoding('utf8');
-
-// 'readable' may be triggered multiple times as data is buffered in
-process.stdin.on('readable', () => {
-  let chunk;
-  // Use a loop to make sure we read all currently available data
-  while ((chunk = process.stdin.read()) !== null) {
-    process.stdout.write(`data: ${chunk}`);
-  }
-});
-
-// 'end' will be triggered once when there is no more data available
-process.stdin.on('end', () => {
-  process.stdout.write('end');
-});
-```
-
-Each call to `stdin.read()` returns a chunk of data. The chunks are not
-concatenated. A `while` loop is necessary to consume all data currently in the
-buffer. When reading a large file `.read()` may return `null`, having
-consumed all buffered content so far, but there is still more data to come not
-yet buffered. In this case a new `'readable'` event will be emitted when there
-is more data in the buffer. Finally the `'end'` event will be emitted when
-there is no more data to come.
-
-Therefore to read a file's whole contents from `stdin` you need to collect
-chunks across multiple `'readable'` events, something like:
-
-```js
-var chunks = [];
-
-process.stdin.on('readable', () => {
-  let chunk;
-  while ((chunk = process.stdin.read()) !== null) {
-    chunks.push(chunk);
-  }
-});
-
-process.stdin.on('end', () => {
-  let content = chunks.join('');
-});
-```
-
 ### `process.stdin.fd`
 
 * {number}
@@ -2639,6 +2597,7 @@ cases:
 [Event Loop]: https://nodejs.org/en/docs/guides/event-loop-timers-and-nexttick/#process-nexttick
 [LTS]: https://github.com/nodejs/Release
 [Readable]: stream.html#stream_readable_streams
+[`readable.read()`]: stream.html#stream_readable_read_size
 [Signal Events]: #process_signal_events
 [Stream compatibility]: stream.html#stream_compatibility_with_older_node_js_versions
 [TTY]: tty.html#tty_tty
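
With this change, `process.md` no longer carries its own reading example. Since `process.stdin` is a Readable stream, reading it follows exactly the pattern now documented under `readable.read()`; as a minimal sketch (essentially the two examples removed above, folded into one):

```js
// Collect everything written to stdin and echo it back once input ends.
process.stdin.setEncoding('utf8');

const chunks = [];

process.stdin.on('readable', () => {
  let chunk;
  // read() returns null once all currently buffered data has been consumed.
  while ((chunk = process.stdin.read()) !== null) {
    chunks.push(chunk);
  }
});

process.stdin.on('end', () => {
  process.stdout.write(chunks.join(''));
});
```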

doc/api/stream.md

Lines changed: 35 additions & 4 deletions
@@ -1118,17 +1118,48 @@ automatically until the internal buffer is fully drained.
 
 ```js
 const readable = getReadableStreamSomehow();
+
+// 'readable' may be triggered multiple times as data is buffered in
 readable.on('readable', () => {
   let chunk;
+  console.log('Stream is readable (new data received in buffer)');
+  // Use a loop to make sure we read all currently available data
   while (null !== (chunk = readable.read())) {
-    console.log(`Received ${chunk.length} bytes of data.`);
+    console.log(`Read ${chunk.length} bytes of data...`);
   }
 });
+
+// 'end' will be triggered once when there is no more data available
+readable.on('end', () => {
+  console.log('Reached end of stream.');
+});
 ```
 
-The `while` loop is necessary when processing data with
-`readable.read()`. Only after `readable.read()` returns `null`,
-[`'readable'`][] will be emitted.
+Each call to `readable.read()` returns a chunk of data, or `null`. The chunks
+are not concatenated. A `while` loop is necessary to consume all data
+currently in the buffer. When reading a large file `.read()` may return `null`,
+having consumed all buffered content so far, but there is still more data to
+come not yet buffered. In this case a new `'readable'` event will be emitted
+when there is more data in the buffer. Finally the `'end'` event will be
+emitted when there is no more data to come.
+
+Therefore to read a file's whole contents from a `readable`, it is necessary
+to collect chunks across multiple `'readable'` events:
+
+```js
+const chunks = [];
+
+readable.on('readable', () => {
+  let chunk;
+  while (null !== (chunk = readable.read())) {
+    chunks.push(chunk);
+  }
+});
+
+readable.on('end', () => {
+  const content = chunks.join('');
+});
+```
 
 A `Readable` stream in object mode will always return a single item from
 a call to [`readable.read(size)`][stream-read], regardless of the value of the
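
One caveat about the `chunks.join('')` example added above: it assumes the stream produces strings (for instance after calling `readable.setEncoding('utf8')`). Without an encoding set, `read()` returns `Buffer` objects, and a sketch along these lines (reusing `readable` from the example) would collect them with `Buffer.concat()` instead:

```js
const chunks = [];

readable.on('readable', () => {
  let chunk;
  while (null !== (chunk = readable.read())) {
    chunks.push(chunk);
  }
});

readable.on('end', () => {
  // Each chunk is a Buffer when no encoding is set; join them byte-for-byte.
  const content = Buffer.concat(chunks);
  console.log(`Collected ${content.length} bytes.`);
});
```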
