readline: buffer line events in async iterator #41708
Conversation
Previously, the async iterator for readline was created lazily when the stream was accessed, which meant no events were buffered before iteration started. In addition, stream handling was modernized and improved in the process to support backpressure correctly. A test was added to assert this.
Fixes: nodejs#28565
Fixes: nodejs#33463
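A minimal sketch of the consumer-visible effect this change targets (based on the linked issues; the stream contents and timing below are illustrative only):

```js
const readline = require('readline');
const { Readable } = require('stream');

const rl = readline.createInterface({
  input: Readable.from(['first\nsecond\nthird\n']),
});

(async () => {
  // Give the 'line' events a chance to fire before iteration starts.
  await new Promise((resolve) => setImmediate(resolve));

  // With the lazily created iterator these early lines could be dropped;
  // with this change they are buffered and delivered here.
  for await (const line of rl) {
    console.log(line);
  }
})();
```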
// Eagerly create the 'line' event iterator so events emitted before the
// consumer starts iterating are buffered rather than dropped.
const onIterator = on(this, 'line');
// When the interface closes, finish the iterator so any loop over it ends.
this.once('close', () => onIterator.return());
// events.on() yields the emit arguments as arrays; unwrap to the line string.
const readable = Readable.from(onIterator).map((line) => line[0]);
this[kLineObjectStream] = readable;
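For context on the `.map((line) => line[0])` step: `events.on()` yields the arguments of each emit as an array, so the line string has to be unwrapped. A small sketch:

```js
const { on, EventEmitter } = require('events');

const ee = new EventEmitter();
const iterator = on(ee, 'line');

// Events emitted before next() is called are queued by events.on().
ee.emit('line', 'hello');

(async () => {
  const { value } = await iterator.next();
  console.log(value); // ['hello'] (the emit arguments, as an array)
  await iterator.return(); // finish the iterator and detach the listener
})();
```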
Note that the current implementation does not close the readline.Interface when the async iterable is closed. This could easily be changed by doing readable.once('close', () => this.close()), though I am not sure which behavior is better.
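A rough sketch of what that suggestion would mean for consumers (the extra wiring is the idea floated in this comment, not what the PR currently does):

```js
const readline = require('readline');
const { Readable } = require('stream');

const rl = readline.createInterface({
  input: Readable.from(['a\nstop\nb\n']),
});

rl.on('close', () => console.log('interface closed'));

(async () => {
  for await (const line of rl) {
    if (line === 'stop') break; // ends the async iteration early
  }
  // As implemented here, breaking out does not close rl by itself; the
  // suggested readable.once('close', () => this.close()) wiring would make
  // rl close at this point as well.
  rl.close(); // what a consumer has to do manually today
})();
```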
I think it should close it. The pattern for async iterators is that the stream is fully consumed after the loop. Maybe add an option?
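If an option were added, it might look something like this (the option name below is purely hypothetical; nothing in this PR or the readline API defines it):

```js
const readline = require('readline');
const { Readable } = require('stream');

const rl = readline.createInterface({
  input: Readable.from(['a\nquit\nb\n']),
  closeOnIteratorReturn: true, // invented name for illustration, ignored by the real API
});

(async () => {
  for await (const line of rl) {
    if (line === 'quit') break; // with such an option, rl would close here too
  }
})();
```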
I think it should close it too, but I was hesitant to change the existing behavior. If you agree it should close, I'll make that change.
I think so.
lgtm
(Note to self: this also needs a doc update to remove the sentence about not attaching the listener early enough.)
The failing backpressure test looks kind of wrong, I think. It wraps the events with a readable, setting its highWaterMark to 16 (the default for objectMode), and stops asking for data once 16 items are buffered. This means that other consumers of the same events are affected as well: it is like one consumer in a multicast system pausing the broadcast for everyone because it can't buffer enough. Alternatively, I'm not sure backpressure makes a ton of sense here. I'll consider it.
I'm probably lost, could you link?
If you are using an async iterator you are the primary consumer of that data. This should probably be documented somewhere, but a lot of unwanted things happen when multiple consumers are attached.
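A sketch of the multi-consumer situation being discussed (illustrative only): a plain 'line' listener and a for await loop attached to the same interface, where backpressure applied because of the slow iterator also throttles what the listener sees.

```js
const readline = require('readline');
const { Readable } = require('stream');

const rl = readline.createInterface({
  input: Readable.from((function* () {
    for (let i = 0; i < 100; i++) yield `line ${i}\n`;
  })()),
});

// Consumer 1: a plain event listener.
rl.on('line', (line) => console.log('listener saw:', line));

// Consumer 2: a slow async iterator. If backpressure pauses the interface
// once its internal buffer fills up, the listener above is throttled too,
// even though it could keep up on its own.
(async () => {
  for await (const line of rl) {
    await new Promise((resolve) => setTimeout(resolve, 10));
    console.log('iterator saw:', line);
  }
})();
```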
There is a test that checks the backpressure behavior. You can see it at https://github.com/nodejs/node/blob/master/test/parallel/test-readline-async-iterators-backpressure.js. It's kind of confusing, since if errors happen those will be buffered too in the async iterable, which is why this behavior was changed earlier. This isn't too hard to "fix", though.
I think I'll split this PR into two - one just fixing the bug and another to discuss backpressure - to make the discussion easier. Hope that's ok?
go for it
I want to wait for #41276 to land first.
await setImmediate();
const result = [];
for await (const item of rl) {
It seems like a redundant await, isn't it?
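For what it's worth, whether the await is redundant depends on which setImmediate is in scope; the comparison below is a general sketch, not a claim about what the test actually imports:

```js
const { setImmediate: setImmediatePromise } = require('timers/promises');

(async () => {
  // Promise-based version: the await is what actually defers to the next turn.
  await setImmediatePromise();

  // Callback-based global: it returns an Immediate handle, not a promise,
  // so an await in front of it would indeed be redundant.
  setImmediate(() => console.log('callback ran'));
})();
```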
This needs a rebase.
Previously the async iterator for readline was created lazily when the stream was accessed, which meant no events were buffered. In addition, stream handling was modernized and improved in the process to support backpressure correctly. A test was added to assert this.

There is a cost here (the stream is always created), but I believe it is well worth it to enable modern usage of readline without dropping lines. There is also an unrelated bugfix here (we return the wrong thing from events.on(...).throw(...)).

cc @nodejs/readline @nodejs/streams @jfriend00
Fixes: #28565
Fixes: #33463