Description
What version of gRPC and what language are you using?
C#, v.2.67.0
What operating system (Linux, Windows,...) and version?
dotnet/aspnet:8.0 docker image
What runtime / compiler are you using (e.g. .NET Core SDK version from dotnet --info)?
.NET SDK 9.0.200
What did you do?
Sending multiple concurrent client-streaming RPCs from the same client to a service that is rate-limited with a ConcurrencyLimiter can cause a deadlock, which delays or prevents requests from being processed.
The issue occurs when a client-streaming RPC method is limited by a ConcurrencyLimiter configured via Microsoft.AspNetCore.RateLimiting:
Program.cs:
using Microsoft.AspNetCore.RateLimiting;
using System.Threading.RateLimiting;

var builder = WebApplication.CreateBuilder(args);

// Add services to the container.
builder.Services.AddGrpc();
builder.Services.AddRateLimiter(rateLimiterOptions => rateLimiterOptions
    .AddConcurrencyLimiter(policyName: "concurrency", options =>
    {
        options.PermitLimit = 1;
        options.QueueProcessingOrder = QueueProcessingOrder.OldestFirst;
        options.QueueLimit = 100;
    }));

var app = builder.Build();

app.UseRateLimiter();

// Configure the HTTP request pipeline.
app.MapGrpcService<GreeterService>();
app.MapGet("/", () => "Communication with gRPC endpoints must be made through a gRPC client. To learn how to create a client, visit: https://go.microsoft.com/fwlink/?linkid=2086909");

await app.RunAsync();
GreeterService.cs:
using Grpc.Core;
using Microsoft.AspNetCore.RateLimiting;

public class GreeterService : Greeter.GreeterBase
{
    private readonly ILogger<GreeterService> _logger;

    public GreeterService(ILogger<GreeterService> logger)
    {
        _logger = logger;
    }

    [EnableRateLimiting("concurrency")]
    public override async Task<HelloReply> SayHello(IAsyncStreamReader<HelloRequest> requestStream, ServerCallContext context)
    {
        await foreach (var chunk in requestStream.ReadAllAsync())
        {
            await Task.Delay(1);
        }

        return new HelloReply
        {
            Message = "Hello"
        };
    }
}
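For reference, the .proto contract isn't shown above; based on the method signatures (IAsyncStreamReader<HelloRequest> on the server, call.RequestStream on the client), I'm assuming SayHello is declared as a client-streaming RPC roughly like this:
greet.proto (assumed):
syntax = "proto3";

service Greeter {
  // Client-streaming: the client sends a stream of HelloRequest messages
  // and receives a single HelloReply.
  rpc SayHello (stream HelloRequest) returns (HelloReply);
}

message HelloRequest {
  string name = 1;
}

message HelloReply {
  string message = 1;
}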
Calling the service with 2 concurrent requests on a single connection is sufficient to trigger the deadlock:
Client Program.cs
var builder = WebApplication.CreateBuilder(args);

builder.Services
    .AddGrpcClient<Greeter.GreeterClient>((services, options) =>
    {
        options.Address = new Uri("https://grpc-server:8081");
    });

var app = builder.Build();

var client = app.Services.GetService<Greeter.GreeterClient>();

async Task SendCall(Greeter.GreeterClient client)
{
    var name = Path.GetRandomFileName();

    using var call = client.SayHello();
    for (int i = 0; i < 100; i++)
    {
        await call.RequestStream.WriteAsync(new HelloRequest { Name = name });
    }
    await call.RequestStream.CompleteAsync();

    await call.ResponseAsync;
    Console.WriteLine($"Greeting returned for {name}");
}

// Both calls deadlock and never finish
await Task.WhenAll(Enumerable.Repeat(client, 2).Select(SendCall));
As far as I can tell, this is caused by gRPC multiplexing both requests onto a single HTTP/2 connection. Because the streamed messages are interleaved on that connection, call 1 (the active request) waits in requestStream.ReadAllAsync() until the preceding messages for call 2 have been read (presumably because call 2's unread messages exhaust the shared connection-level flow-control window, which then also stalls delivery of call 1's messages). However, call 2 is queued by the ConcurrencyLimiter until call 1 has completed, so its messages are never read. As a result, no progress can be made on call 1 and the service deadlocks.
This issue is simple to reproduce with 2 requests and a permit limit of 1; however, any combination of clients in which every active request shares a connection with a queued request will result in the same deadlock.
This issue can be worked around by setting MaxStreamsPerConnection = 1 (Kestrel's HTTP/2 stream limit) to prevent multiplexing.
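For completeness, a minimal sketch of that workaround, assuming the limit is applied through Kestrel's HTTP/2 options in the server's Program.cs:
var builder = WebApplication.CreateBuilder(args);

// Workaround sketch: advertise a single HTTP/2 stream per connection so
// that concurrent calls cannot be multiplexed behind a call that the
// ConcurrencyLimiter has queued.
builder.WebHost.ConfigureKestrel(kestrel =>
{
    kestrel.Limits.Http2.MaxStreamsPerConnection = 1;
});

// ... AddGrpc/AddRateLimiter and the rest of the pipeline as above ...
With the stream limit at 1, the second call waits at the HTTP/2 layer instead of interleaving its messages behind the queued call, which should avoid the deadlock at the cost of effectively serializing calls per connection.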
I'm not sure there's a good solution to this problem; it seems like an unavoidable consequence of limiting parallelism while channels share a connection. However, I wasn't able to find any previous discussion of this issue. As the setup is fairly minimal, it would be good if such a deadlock were detectable and at least produced a warning.
What did you expect to see?
No deadlock, or a warning that requests had deadlocked.
What did you see instead?
The RPC service deadlocked and stopped processing requests, with no warning logged.