[FEATURE] Support non-streaming mode for OpenAI model #245

Open
@nandateja

Description

Problem Statement

I would like Strands to support non-streaming mode for OpenAI and OpenAI-compatible servers.

Currently, when invoking any OpenAI-compatible server in non-streaming mode, we encounter an error:

```python
model = OpenAIModel(
    client_args={
        "api_key": <API_KEY>,
        "base_url": <BASE_URL>,
    },
    model_id=<MODEL_ID>,
    params={
        "max_tokens": 1000,
        "temperature": 0.7,
    },
)

agent = Agent(model=model)
response = agent("What is 2+2")
```

```
UnboundLocalError: cannot access local variable 'choice' where it is not associated with a value
```

It would be great to add support similar to Bedrock, where non-streaming mode is supported.


Proposed Solution

We can follow an approach similar to Bedrock's, which converts a complete response into streaming events:

```python
def _convert_non_streaming_to_streaming(self, response: dict[str, Any]) -> Iterable[StreamEvent]:
```
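As a rough illustration of what that conversion could look like, here is a minimal sketch. The standalone function name, the event dictionary shapes, and the assumed OpenAI chat-completion response layout are illustrative assumptions; they only loosely mirror Strands' actual `StreamEvent` types:

```python
from typing import Any, Iterable


def convert_non_streaming_to_streaming(response: dict[str, Any]) -> Iterable[dict[str, Any]]:
    """Sketch: yield stream-like events from a complete OpenAI chat
    completion response. Event shapes are illustrative, not the exact
    Strands StreamEvent types."""
    choice = response["choices"][0]
    message = choice["message"]

    yield {"messageStart": {"role": message.get("role", "assistant")}}

    # Emit the full text content as a single delta, then close the block.
    if message.get("content"):
        yield {"contentBlockDelta": {"delta": {"text": message["content"]}}}
        yield {"contentBlockStop": {}}

    yield {"messageStop": {"stopReason": choice.get("finish_reason", "stop")}}

    # Surface token usage when the server reports it.
    usage = response.get("usage")
    if usage:
        yield {"metadata": {"usage": usage}}
```

The caller would then iterate these events exactly as it does for a real streamed response, so the rest of the agent loop stays unchanged.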

Use Case

To support any OpenAI model or agentic use case that doesn't need streaming.

Alternative Solutions

No response

Additional Context

No response

Labels: enhancement (New feature or request)