Describe the bug
When running agents (simple agent with model gpt-4o-mini) using:
- OpenAIChatCompletionsModel
- AsyncOpenAI client
I'm seeing the following error:
openai.BadRequestError: Error code: 400 - {'error': {'message': "The 'stream_options' parameter is only allowed when 'stream' is enabled.", 'type': 'invalid_request_error', 'param': 'stream_options', 'code': None}}
I'm not sure whether this happens only in Docker; I haven't tested it outside of a container. However, the error is not present in version 0.0.9.
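For reference, the OpenAI API rejects stream_options on any non-streaming request, so the 400 can be reproduced with the raw client alone. A minimal sketch of that constraint (assuming the SDK now attaches stream_options={"include_usage": True} to non-streaming chat.completions calls, which is what the error message suggests; the model and message are arbitrary):

import asyncio
import os

from openai import AsyncOpenAI


async def main():
    client = AsyncOpenAI(api_key=os.environ["OPENAI_API_KEY"])
    # Sending stream_options without stream=True raises openai.BadRequestError
    # with the same "'stream_options' parameter is only allowed when 'stream'
    # is enabled" message as above.
    await client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": "test"}],
        stream_options={"include_usage": True},  # no stream=True on purpose
    )


if __name__ == "__main__":
    asyncio.run(main())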
Debug information
- Agents SDK version: 0.0.10
- Python version: 3.10
Repro steps
main.py:

import asyncio
import os

from agents import Agent, OpenAIChatCompletionsModel, Runner
from openai import AsyncOpenAI


async def main():
    # API key from the environment (the original repro read it from a project-specific utils.config)
    client = AsyncOpenAI(api_key=os.environ["OPENAI_API_KEY"])
    model = OpenAIChatCompletionsModel(model="gpt-4o-mini", openai_client=client)

    agent = Agent(
        name="Test agent",
        instructions="Test agent",
        model=model,
    )

    raw = await Runner.run(agent, "test")
    print(f"Full result of calling '{agent.name}': {raw}")


if __name__ == "__main__":
    asyncio.run(main())

Dockerfile:
FROM python:3.10
RUN mkdir -p /test/src
WORKDIR /test/src
COPY requirements.txt /test/src
RUN pip install --no-cache-dir -r requirements.txt
COPY ./src /test/src
CMD ["python", "main.py"]
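As a temporary workaround, pinning the Agents SDK back to 0.0.9 (where the error does not occur) in requirements.txt avoids the failure. A sketch, assuming the SDK is installed from PyPI as openai-agents:

openai-agents==0.0.9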
Expected behavior
This flow should work without failures