Support async calls by returning Mono<ChatResponse> #538

Open
@XhstormR

Description

Please do a quick search on GitHub issues first, the feature you are about to request might have already been requested.

Expected Behavior

The call method of ChatClient should support returning a Mono object to enable asynchronous calls.

For example:

public interface AsyncChatClient extends ModelClient<Prompt, ChatResponse> {
    @Override
    Mono<ChatResponse> call(Prompt prompt);
}

Current Behavior

Currently, the call method of AzureOpenAiChatClient does not support asynchronous calls. Only the stream method returns an async object, a Flux, but a Flux represents multiple results. I need a Mono, which represents a single result.
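As a stopgap until a Mono-returning call is available, the blocking call can be offloaded to a worker thread. A minimal sketch of that offloading pattern using only the JDK is below; blockingCall is a hypothetical stand-in for ChatClient.call(prompt), and Reactor users could wrap the resulting future with Mono.fromFuture to obtain a Mono.

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class AsyncCallSketch {

    // Hypothetical stand-in for the blocking ChatClient.call(prompt).
    static String blockingCall(String prompt) {
        return "response to " + prompt;
    }

    // Offload the blocking call to a worker thread. Reactor users can bridge
    // this future with Mono.fromFuture(...) to get a single-result async type.
    static CompletableFuture<String> callAsync(String prompt, ExecutorService executor) {
        return CompletableFuture.supplyAsync(() -> blockingCall(prompt), executor);
    }

    public static void main(String[] args) throws Exception {
        ExecutorService executor = Executors.newSingleThreadExecutor();
        try {
            // The caller's thread is not blocked until get() is invoked.
            String result = callAsync("hello", executor).get();
            System.out.println(result);
        } finally {
            executor.shutdown();
        }
    }
}
```

This only hides the blocking call on another thread; true non-blocking support would require the client to use the underlying SDK's async API end to end.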

Context

I used the com.azure:azure-ai-openai SDK before. That SDK provides two types of client: OpenAIClient and OpenAIAsyncClient. OpenAIAsyncClient provides complete Mono-based async support for a single result. But when I migrated to Spring AI, I found that Mono support was not provided.
