For the concurrency in the other helpers to be useful, you have to call `.next()` multiple times and buffer the results as they come in.

We should have a helper which does that for you, giving you an async generator which reads from the buffer and keeps the buffer full. I'm tentatively calling it `bufferAhead` but open to other names.
That would let you do stuff like
```js
let pages = asyncIteratorOfUrls
  .map(u => fetch(u))
  .bufferAhead(2);

for await (let page of pages) {
  console.log(await examine(page));
}
```
And even without using helpers like `map`, you could still make some use of it with raw async generators and other async iterables when the processing of the data you get from the iterable is async, as in
```js
// run the `examine` step concurrently with fetching the next item from the buffer
for await (let item of asyncGenerator().bufferAhead(1)) {
  console.log(await examine(item));
}
```
I note that such a helper exists in the userland iter-tools library, spelled `asyncBuffer`.
Open questions:
- Should it start buffering as soon as you make the helper, or only once you do the first read from it? (There's a small illustration of the observable difference after this list.)
- When someone calls `.next()` and gets an item which hasn't yet settled, should that trigger another pull from the underlying iterator, or should the parameter to `bufferAhead` serve as an upper bound on the degree of concurrency to request from the underlying iterator (as well as the lower bound it obviously is)? (I lean towards "it should let you pull more things if you explicitly ask for them"; we should have other mechanisms for limiting concurrency.)
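On the first question, the difference is only observable when there's a gap between creating the buffered iterator and starting to consume it. A sketch of such a scenario (using the hypothetical `bufferAhead` spelling from above; `urlSource` and `doUnrelatedSetup` are placeholders):

```js
const pages = urlSource()          // placeholder: some async iterable of URLs
  .map(u => fetch(u))
  .bufferAhead(2);

// Eager semantics: two fetches are already in flight during this await.
// Lazy semantics: nothing is pulled until the loop below calls .next().
await doUnrelatedSetup();          // placeholder: unrelated async work

for await (const page of pages) {
  console.log(await examine(page));
}
```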