A set of extensions for optimizing/simplifying System.Threading.Channels usage.
Click here for detailed documentation.
Read and write operations, with optional concurrency levels:
- Reading all entries in a channel.
- Writing all entries from a source to a channel.
- Piping (consuming) all entries to a buffer (channel).
- `.AsAsyncEnumerable()` (`IAsyncEnumerable`) support for .NET Standard 2.1+ and .NET Core 3+.
- `Filter`: reads from the channel until a match is found.
- `Transform`: applies a transform function upon successfully reading an item from the channel.
- `Batch`: attempts to group items into a `List<T>` (or a `Queue<T>`) before being available for reading.
- `Join`: combines batches into a single channel.
```powershell
Install-Package Open.ChannelExtensions
```
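If you prefer the .NET CLI over the Package Manager console, the equivalent command should be:

```sh
dotnet add package Open.ChannelExtensions
```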
Define an asynchronous pipeline, with best-practice usage, using simple and expressive syntax:
```csharp
await Channel
    .CreateBounded<T>(10)
    .SourceAsync(source /* IEnumerable<Task<T>> */)
    .PipeAsync(
        maxConcurrency: 2,
        capacity: 5,
        transform: asyncTransform01)
    .Pipe(transform02, /* capacity */ 3)
    .ReadAllAsync(finalTransformedValue => {
        // Do something async with each final value.
    });
```
```csharp
await source /* IEnumerable<T> */
    .ToChannel(boundedSize: 10, singleReader: true)
    .PipeAsync(asyncTransform01, /* capacity */ 5)
    .Pipe(
        maxConcurrency: 2,
        capacity: 3,
        transform: transform02)
    .ReadAll(finalTransformedValue => {
        // Do something with each final value.
    });
```
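For orientation, here is a self-contained sketch of the same pipeline pattern using only the calls shown above (`ToChannel`, `PipeAsync`, `Pipe`, `ReadAll`); the integer workload, the `Task.Delay` stand-in for async work, and the formatting step are invented for illustration:

```csharp
using System;
using System.Linq;
using System.Threading.Tasks;
using Open.ChannelExtensions;

class PipelineSketch
{
    static async Task Main()
    {
        var count = 0;

        await Enumerable.Range(1, 100)                   // IEnumerable<int>
            .ToChannel(boundedSize: 10, singleReader: true)
            .PipeAsync(async n =>
            {
                await Task.Delay(10);                    // simulate async work
                return n * n;
            }, /* capacity */ 5)
            .Pipe(n => $"Result: {n}", /* capacity */ 3) // synchronous transform
            .ReadAll(line =>
            {
                count++;
                Console.WriteLine(line);
            });

        Console.WriteLine($"Processed {count} items.");
    }
}
```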
To read all entries from a channel, one by one or concurrently:

```csharp
await channel.ReadAll(
    entry => { /* Processing Code */ });

await channel.ReadAll(
    (entry, index) => { /* Processing Code */ });

await channel.ReadAllAsync(
    async entry => { await /* Processing Code */ });

await channel.ReadAllAsync(
    async (entry, index) => { await /* Processing Code */ });

await channel.ReadAllConcurrently(
    maxConcurrency,
    entry => { /* Processing Code */ });

await channel.ReadAllConcurrentlyAsync(
    maxConcurrency,
    async entry => { await /* Processing Code */ });
```
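A minimal end-to-end sketch of concurrent reading, assuming an unbounded channel filled with invented work items:

```csharp
using System;
using System.Threading.Channels;
using System.Threading.Tasks;
using Open.ChannelExtensions;

class ConcurrentReadSketch
{
    static async Task Main()
    {
        var channel = Channel.CreateUnbounded<int>();

        // Queue up some work items and close the writer.
        for (var i = 0; i < 20; i++)
            channel.Writer.TryWrite(i);
        channel.Writer.Complete();

        // Up to 4 readers process entries at the same time.
        await channel.ReadAllConcurrentlyAsync(4, async entry =>
        {
            await Task.Delay(50); // simulate async work
            Console.WriteLine($"Processed {entry}");
        });
    }
}
```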
If `complete` is `true`, the channel will be closed when the source is empty.
```csharp
// source can be any IEnumerable<T>.
await channel.WriteAll(source, complete: true);

// source can be any IEnumerable<Task<T>> or IEnumerable<ValueTask<T>>.
await channel.WriteAllAsync(source, complete: true);

// source can be any IEnumerable<Task<T>> or IEnumerable<ValueTask<T>>.
await channel.WriteAllConcurrentlyAsync(
    maxConcurrency, source, complete: true);
```
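A small self-contained sketch, assuming an unbounded channel and an invented task-producing source, that writes everything and then reads it back:

```csharp
using System;
using System.Linq;
using System.Threading.Channels;
using System.Threading.Tasks;
using Open.ChannelExtensions;

class WriteAllSketch
{
    static async Task Main()
    {
        var channel = Channel.CreateUnbounded<int>();

        // Write the whole source, then close the channel (complete: true).
        await channel.WriteAllAsync(
            Enumerable.Range(1, 50).Select(i => Task.FromResult(i * 10)),
            complete: true);

        // Because the writer completed, ReadAll finishes once the channel is empty.
        await channel.ReadAll(value => Console.WriteLine(value));
    }
}
```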
The `Filter` and `Transform` extensions operate synchronously after an item is read from the channel.
Any predicate or selector function must trap errors, or the downstream read will fail and data may not be recoverable.
```csharp
// Filter and transform when reading.
channel.Reader
    .Filter(predicate)   // .Where()
    .Transform(selector) // .Select()
    .ReadAllAsync(async value => { /*...*/ });
```
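Given the caveat above, a selector will typically trap its own errors and fall back to a sentinel value. A rough sketch, where the parsing workload and the `-1` sentinel are invented for illustration:

```csharp
using System;
using System.Threading.Channels;
using System.Threading.Tasks;
using Open.ChannelExtensions;

class FilterTransformSketch
{
    static async Task Main()
    {
        var channel = Channel.CreateUnbounded<string>();
        foreach (var text in new[] { "1", "two", "3", " ", "42" })
            channel.Writer.TryWrite(text);
        channel.Writer.Complete();

        await channel.Reader
            .Filter(text => !string.IsNullOrWhiteSpace(text)) // drop blanks
            .Transform(text =>
            {
                // Trap errors here so a bad entry cannot fault the downstream read.
                try { return int.Parse(text); }
                catch (FormatException) { return -1; } // invented sentinel value
            })
            .ReadAllAsync(async value =>
            {
                await Task.Yield();
                Console.WriteLine(value);
            });
    }
}
```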
Batching groups items into `List<T>` batches of a maximum size before they are available for reading:

```csharp
values.Reader
    .Batch(10 /* batch size */) // Groups into List<T>.
    .WithTimeout(1000)          // Any non-empty batches are flushed every second.
    .ReadAllAsync(async batch => { /*...*/ });
```
`Join` is the inverse of batching:
```csharp
batches.Reader
    .Join() // Combines the batches into a single channel.
    .ReadAllAsync(async value => { /*...*/ });
```
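To see both operators together, this sketch (with an invented integer workload) batches values and then joins the batches back into individual items:

```csharp
using System;
using System.Threading.Channels;
using System.Threading.Tasks;
using Open.ChannelExtensions;

class BatchJoinSketch
{
    static async Task Main()
    {
        var values = Channel.CreateUnbounded<int>();
        for (var i = 0; i < 25; i++)
            values.Writer.TryWrite(i);
        values.Writer.Complete();

        // Group into batches of 10, then flatten them back out.
        await values.Reader
            .Batch(10)
            .Join()
            .ReadAll(value => Console.WriteLine(value)); // prints each value individually
    }
}
```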
`Pipe` reads values from a source channel, transforms them, and writes the results to a new buffered channel:

```csharp
// Transform values in a source channel to a new unbounded channel.
var transformed = channel.Pipe(
    async value => /* transformation */);
```

```csharp
// Transform values in a source channel to a new unbounded channel with a max concurrency of X.
const int X = 4;
var transformed = channel.Pipe(
    X, async value => /* transformation */);
```

```csharp
// Transform values in a source channel to a new bounded channel with a bound of N entries.
const int N = 5;
var transformed = channel.Pipe(
    async value => /* transformation */, N);
```

```csharp
// Transform values in a source channel to a new bounded channel with a bound of N entries and a max concurrency of X.
const int X = 4;
const int N = 5;
var transformed = channel.Pipe(
    X, async value => /* transformation */, N);

// or

transformed = channel.Pipe(
    maxConcurrency: X,
    capacity: N,
    transform: async value => /* transformation */);
```
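As a final sketch mirroring the named-parameter form above, with an invented word-length workload:

```csharp
using System;
using System.Threading.Channels;
using System.Threading.Tasks;
using Open.ChannelExtensions;

class PipeSketch
{
    static async Task Main()
    {
        var words = Channel.CreateUnbounded<string>();
        foreach (var w in new[] { "alpha", "beta", "gamma", "delta" })
            words.Writer.TryWrite(w);
        words.Writer.Complete();

        // Up to 2 transforms run at once; the result channel holds at most 4 items.
        var lengths = words.Pipe(
            maxConcurrency: 2,
            capacity: 4,
            transform: word => word.Length);

        await lengths.ReadAll(len => Console.WriteLine(len));
    }
}
```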