Conversation

cryptekbits

- Introduced `use_responses` flag in ModelSettings to toggle the use of the OpenAI Responses API (see the sketch after this list).
- Implemented methods to adapt response streaming and extract text from Responses API output.
- Updated `send_completion` method to handle Responses API requests and responses.
- Added new model configurations for GPT-5-Codex with Responses API support in model-settings.yml (example entry below).
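
As a rough illustration of what these bullets describe (not the PR's actual diff), the sketch below wires a `use_responses` toggle into a `send_completion` helper, extracts text from a Responses API result, and adapts its streaming events into plain text chunks. The `ModelSettings` dataclass and the helper names are illustrative assumptions; only the OpenAI SDK calls (`client.responses.create`, `response.output_text`, the `response.output_text.delta` stream events) are real API surface.

```python
# Minimal sketch, assuming an aider-style ModelSettings object and helper
# names; the real PR may structure this differently.
from dataclasses import dataclass

from openai import OpenAI


@dataclass
class ModelSettings:
    name: str
    use_responses: bool = False  # route through the Responses API when True


def extract_responses_text(response) -> str:
    # The OpenAI SDK exposes the concatenated text of a Responses API
    # result via the `output_text` convenience property.
    return response.output_text


def adapt_responses_stream(stream):
    # Yield plain text chunks from Responses API streaming events so callers
    # can consume them the same way as Chat Completions deltas.
    for event in stream:
        if event.type == "response.output_text.delta":
            yield event.delta


def send_completion(client: OpenAI, settings: ModelSettings, prompt: str) -> str:
    if settings.use_responses:
        response = client.responses.create(model=settings.name, input=prompt)
        return extract_responses_text(response)
    # Models without the flag keep using the Chat Completions API.
    completion = client.chat.completions.create(
        model=settings.name,
        messages=[{"role": "user", "content": prompt}],
    )
    return completion.choices[0].message.content


if __name__ == "__main__":
    client = OpenAI()
    codex = ModelSettings(name="gpt-5-codex", use_responses=True)
    print(send_completion(client, codex, "Say hello in one word."))
```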
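
A hypothetical model-settings.yml entry for GPT-5-Codex might look like the following; only the model name and the new `use_responses` flag are taken from the PR description, and any other fields would need to match the project's actual schema.

```yaml
# Hypothetical entry; field names beyond `name` and `use_responses`
# are intentionally omitted.
- name: gpt-5-codex
  use_responses: true
```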
@aguindehi

Does gpt-5-codex work for you with this MR?
