Description
Problem Statement
I would like Strands to support non-streaming mode for OpenAI and OpenAI-compatible servers.
Currently, when invoking any OpenAI-compatible server in non-streaming mode, we encounter an error:
```python
from strands import Agent
from strands.models.openai import OpenAIModel

model = OpenAIModel(
    client_args={
        "api_key": <API_KEY>,
        "base_url": <BASE_URL>,
    },
    model_id=<MODEL_ID>,
    params={
        "max_tokens": 1000,
        "temperature": 0.7,
    },
)
agent = Agent(model=model)
response = agent("What is 2+2")
```
```
UnboundLocalError: cannot access local variable 'choice' where it is not associated with a value
```
It would be great to add support similar to Bedrock, where non-streaming mode is already supported.

Proposed Solution
We can follow an approach similar to Bedrock's:
`sdk-python/src/strands/models/bedrock.py`, line 371 at commit `cc5be12`
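
A minimal sketch of what the non-streaming path could look like for `OpenAIModel`, assuming the event shapes are modeled on the Bedrock-style stream events Strands already emits. The helper name, the exact event types, and the integration point are assumptions for illustration, not the current API:

```python
# Sketch only: make a single non-streaming Chat Completions call and re-emit
# the full response as stream-style events, so the rest of the agent loop
# stays unchanged. Event dict shapes below are assumed, not the SDK's exact types.
from typing import Any, Iterable

import openai


def non_streaming_events(client: openai.OpenAI, request: dict[str, Any]) -> Iterable[dict[str, Any]]:
    """Call the OpenAI Chat Completions API with stream=False and yield
    stream-like events built from the complete response."""
    request = {**request, "stream": False}
    response = client.chat.completions.create(**request)
    choice = response.choices[0]  # always bound here, unlike in the streaming loop

    yield {"messageStart": {"role": "assistant"}}
    if choice.message.content:
        yield {"contentBlockDelta": {"delta": {"text": choice.message.content}}}
    yield {"contentBlockStop": {}}
    yield {"messageStop": {"stopReason": choice.finish_reason}}
    if response.usage:
        yield {
            "metadata": {
                "usage": {
                    "inputTokens": response.usage.prompt_tokens,
                    "outputTokens": response.usage.completion_tokens,
                    "totalTokens": response.usage.total_tokens,
                }
            }
        }
```

Wiring this in behind a config option would mirror how the Bedrock provider falls back to a single non-streaming request when streaming is disabled, and it also avoids the `UnboundLocalError`, since the choice is always present in the full response.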
Use Case
To support any OpenAI model or agentic use case that does not need streaming.
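
For illustration, opting out of streaming could look like the following. The `streaming` flag is hypothetical here, borrowed from the Bedrock provider's config; the final name and placement are up to the maintainers:

```python
# Hypothetical: a `streaming` flag on OpenAIModel, analogous to BedrockModel's.
model = OpenAIModel(
    client_args={"api_key": "<API_KEY>", "base_url": "<BASE_URL>"},
    model_id="<MODEL_ID>",
    streaming=False,  # assumed opt-out; not in the current release
    params={"max_tokens": 1000, "temperature": 0.7},
)
```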
Alternative Solutions
No response
Additional Context
No response