Labels: memory, stream
Description
- Version: v15.6.0
- Platform: Darwin MBPM.local 20.2.0 Darwin Kernel Version 20.2.0: Wed Dec 2 20:39:59 PST 2020; root:xnu-7195.60.75~1/RELEASE_X86_64 x86_64
- Subsystem: stream.pipeline
What steps will reproduce the bug?
- Launch the following script with `node --inspect-brk --expose-gc test.js`
- Open a Chrome inspector and run until the first `debugger` statement is reached (it will auto-pause)
- Start an Allocation timeline profiling
- Resume the script (F8)
- When the second `debugger` statement is reached, stop the profiling
- Notice that 10 references to the PassThrough are still in memory (a script-only check without DevTools is sketched after the script below)
```js
const { PassThrough, pipeline } = require('stream')
const { promisify } = require('util')

const nextTick = promisify(process.nextTick)

const consume = async (it) => {
  for await (const _item of it) continue
}

async function* genItems(count) {
  const source = async function* () {
    // Generate way too many items
    while (true) {
      await nextTick() // simulate async
      yield { foo: 'bar' }
    }
  }
  const duplex = new PassThrough({ objectMode: true })
  const stream = pipeline(source, duplex, (err) => {
    console.log('done', err) // err.code === ERR_STREAM_PREMATURE_CLOSE
  })
  for await (const item of stream) {
    yield item
    if (--count <= 0) break
  }
}

const test = async () => {
  const stream = genItems(2)
  await consume(stream)
}

const main = async () => {
  debugger // <== Start memory allocation timeline profiling now
  for (let i = 0; i < 10; i++) await test()
  global.gc()
  debugger // <== Stop memory profiling now
}

main()
```
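For a check that does not need DevTools, here is a minimal sketch using FinalizationRegistry (available on v15.6.0) that instruments the same reproduction and counts how many PassThrough instances are ever reclaimed. The `created`/`reclaimed` counters, the extra GC calls, and the timeouts are additions for this sketch, and finalization timing is not guaranteed, so treat the numbers as an approximation. If the retention described above is real, `reclaimed` should stay well below `created` even after the forced GC. Run it with `node --expose-gc` (no inspector needed).

```js
const { PassThrough, pipeline } = require('stream')
const { promisify } = require('util')

const nextTick = promisify(process.nextTick)

// Track how many PassThrough instances were created vs. reclaimed by GC.
let created = 0
let reclaimed = 0
const registry = new FinalizationRegistry(() => { reclaimed++ })

async function* genItems(count) {
  const source = async function* () {
    while (true) {
      await nextTick()
      yield { foo: 'bar' }
    }
  }
  const duplex = new PassThrough({ objectMode: true })
  registry.register(duplex, ++created)
  const stream = pipeline(source, duplex, () => {})
  for await (const item of stream) {
    yield item
    if (--count <= 0) break
  }
}

const main = async () => {
  for (let i = 0; i < 10; i++) {
    for await (const _item of genItems(2)) continue
  }
  // Force GC and give the finalization callbacks a chance to run.
  global.gc()
  await new Promise((resolve) => setTimeout(resolve, 100))
  global.gc()
  await new Promise((resolve) => setTimeout(resolve, 100))
  console.log(`created: ${created}, reclaimed: ${reclaimed}`)
}

main()
```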
How often does it reproduce? Is there a required condition?
It only occurs when breaking out of the `for await` loop over the async generator (the `break` in `genItems` above).
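For contrast, here is a sketch of a variant of `genItems` that never breaks out of the loop; the source is made finite for the sketch so the loop can end on its own, which is the only other change. Going by the statement above, this version should not show the retention, though this exact variant has not been profiled here.

```js
const { PassThrough, pipeline } = require('stream')

// Same shape as genItems, but the source is finite and the consumer loop
// runs to completion instead of breaking early.
async function* genItemsNoBreak(count) {
  const source = async function* () {
    for (let i = 0; i < count; i++) yield { foo: 'bar' }
  }
  const duplex = new PassThrough({ objectMode: true })
  const stream = pipeline(source, duplex, (err) => {
    console.log('done', err) // should complete without a premature-close error
  })
  for await (const item of stream) yield item // no break here
}
```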
What is the expected behavior?
It should be possible to use a stream pipeline as an async generator without leaking memory.
What do you see instead?
When profiling, references to the PassThrough still appear in memory after the forced GC:
[allocation timeline screenshot: retained PassThrough instances]
Additional information
Might be linked to #35452
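A smaller variant without the wrapping async generator, in case it helps narrow things down; it keeps only the early break out of the pipeline's async iterator. Whether this reduced form shows the same retention has not been verified here, so it is only a sketch. Run it with the same flags as the original script.

```js
const { PassThrough, pipeline } = require('stream')

const run = async () => {
  const source = async function* () {
    while (true) yield { foo: 'bar' }
  }
  const duplex = new PassThrough({ objectMode: true })
  const stream = pipeline(source, duplex, () => {})
  let seen = 0
  for await (const _item of stream) {
    if (++seen >= 2) break // break out of the iterator early
  }
}

const main = async () => {
  for (let i = 0; i < 10; i++) await run()
  global.gc()
  debugger // inspect retained PassThrough instances at this point
}

main()
```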