Remove use of ConfigureAwait from Microsoft.Extensions.AI.dll for AIFunction invocations #6250
Conversation
We try to use ConfigureAwait(false) throughout our libraries. However, we exempt ourselves from that in cases where user code is expected to be called back from within the async code, and there's a reasonable presumption that such code might care about the synchronization context. AIFunction fits that bill. And FunctionInvokingChatClient needs to invoke such functions, which means that we need to be able to successfully flow the context from where user code calls Get{Streaming}ResponseAsync through into wherever a FunctionInvokingChatClient is in the middleware pipeline.

We could try to selectively avoid ConfigureAwait(false) on the path through middleware that could result in calls to FICC.Get{Streaming}ResponseAsync, but that's fairly brittle and hard to maintain. Instead, this PR just removes ConfigureAwait use from the M.E.AI library. It also fixes a few places where tasks were explicitly being created and queued to the thread pool.
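To make the trade-off concrete, here is a minimal sketch (not the actual Microsoft.Extensions.AI source) of a middleware-style method that awaits inner work and then invokes a user-supplied callback; the method and parameter names are illustrative only.

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

public static class ContextFlowSketch
{
    public static async Task InvokeWithContextAsync(
        Func<Task> innerCall, Action userCallback)
    {
        // No ConfigureAwait(false): the continuation posts back to the captured
        // SynchronizationContext, so userCallback observes the caller's context.
        await innerCall();
        userCallback();
    }

    public static async Task InvokeWithoutContextAsync(
        Func<Task> innerCall, Action userCallback)
    {
        // ConfigureAwait(false): the continuation may resume on a thread-pool
        // thread, and SynchronizationContext.Current is typically null here,
        // so context-sensitive user code in userCallback can misbehave.
        await innerCall().ConfigureAwait(false);
        userCallback();
    }
}
```

Because FunctionInvokingChatClient can sit anywhere in the middleware pipeline, every awaiting frame between the user's Get{Streaming}ResponseAsync call and the function invocation would have to behave like the first method for the callback to still see the caller's context, which is why selectively avoiding ConfigureAwait(false) on just that path is brittle.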
I think I need to better understand this exemption and its implications. For example, one could argue that…
It's the "and there's a reasonable presumption that such code might care about the synchronization context" part.
And what decides that presumption, I guess, is the fact that both here and in S.L.AsyncEnumerable we're unambiguously passing a single delegate for execution?
Yes. It's a subjective call about how likely it is to really be an issue, i.e. whether someone has a reasonable assumption that their code will be invoked in a particular context.
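As a hedged illustration of the kind of user code that presumption covers, the sketch below wraps a context-sensitive delegate as an AIFunction, assuming the AIFunctionFactory.Create(Delegate, ...) overload; the tool name and body are hypothetical.

```csharp
using System.Threading;
using Microsoft.Extensions.AI;

// Hypothetical tool whose body assumes it runs on the caller's synchronization
// context (think UI-bound state or other context-affine code).
AIFunction describeContext = AIFunctionFactory.Create(
    () => SynchronizationContext.Current is null
        ? "no synchronization context captured"
        : "running on the caller's synchronization context",
    name: "describe_context");

// If the middleware between the user's Get{Streaming}ResponseAsync call and
// FunctionInvokingChatClient awaited with ConfigureAwait(false), this delegate
// would typically observe a null SynchronizationContext.Current.
```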