Stream destruction breaks async-iteration and ends nodejs event loop #23730
Comments
@nodejs/streams
Simplified repro:

```js
(async () => {
  var stream = require('fs').createReadStream(__filename);
  stream.destroy();
  await stream[Symbol.asyncIterator]().next();
  console.log('Done');
})().catch(console.log);
```

The relevant code is node/lib/internal/streams/async_iterator.js, lines 143 to 152 in 05394d2.
Definitely it's a bug, as @Hakerh400 showed. @mika-fischer the event loop is shutting down because there is nothing asynchronous left to do. If you add the following:

```js
setInterval(() => {
  console.log('a second passed')
}, 1000)
```

the application will keep running.
See #23785 for a fix.
Fixed in 3ec8cec.
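For anyone checking whether a given Node.js version carries the fix, one possible sketch (not from this thread; the 1-second timeout and the wording of the log messages are arbitrary) is to race the pending `next()` against a timer: on affected versions the timer wins because the promise never settles, while on fixed versions the promise settles first.

```js
const fs = require('fs');

(async () => {
  const stream = fs.createReadStream(__filename);
  stream.destroy();

  // The call that, on affected versions, returns a promise that never settles.
  const pending = stream[Symbol.asyncIterator]().next();

  // A timer both keeps the event loop alive and acts as the "never settled" signal.
  const timer = new Promise((resolve) =>
    setTimeout(() => resolve('timed out: next() never settled'), 1000));

  console.log(await Promise.race([
    pending.then(
      (result) => `settled: ${JSON.stringify(result)}`,
      (err) => `rejected: ${err.message}`,
    ),
    timer,
  ]));
})();
```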
Async iteration over a readable stream breaks horribly if for any reason the stream is destroyed while the iteration is still running (or even if it is destroyed before starting the iteration). I would expect that the iteration runs normally or throws an exception. What happens instead is that the whole event loop is shut down.
Most notably this makes async iteration useless with pipeline.

Test code:
Output:
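Neither the original test code nor its output is preserved in this copy of the issue. As a stand-in, here is a minimal sketch of the "destroyed while the iteration is still running" case described above (an illustration, not the author's original snippet):

```js
const fs = require('fs');

// Read this file in small chunks so the loop runs more than once.
const stream = fs.createReadStream(__filename, { highWaterMark: 16 });

(async () => {
  for await (const chunk of stream) {
    console.log('got', chunk.length, 'bytes');
    // Destroy the stream from inside the loop, i.e. while iteration is running.
    stream.destroy();
  }
  // On affected versions neither of the following lines is reached: the next
  // pending next() never settles, and the process exits as soon as the event
  // loop has nothing left to do.
  console.log('iteration finished');
})().catch((err) => console.error('iteration threw:', err));
```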