feat: add openai responses api capability #862
Conversation
@matthewmichel is attempting to deploy a commit to the Vercel Team on Vercel. A member of the Team first needs to authorize it.
New and updated dependencies detected.
The timeline of this PR is the reason why open source is the future.
Short and sweet. Love it!
@tobiasbueschel Changes made so that messages are not stored on the OpenAI side. Open to any other suggestions/feedback.
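For reference, a minimal sketch of how storage could be disabled when calling the Responses API through the AI SDK; the `store` provider option and the model ID below are assumptions based on the public `@ai-sdk/openai` docs, not necessarily the exact change in this PR:

```ts
import { openai } from '@ai-sdk/openai';
import { generateText } from 'ai';

// Call the Responses API without persisting the conversation on OpenAI's side.
// `store: false` is the assumed provider option for opting out of storage.
const { text } = await generateText({
  model: openai.responses('gpt-4o'),
  prompt: 'Summarize the latest changes in this PR.',
  providerOptions: {
    openai: { store: false },
  },
});
```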
@tobiasbueschel @matthewmichel Any chance this could be merged soon? LiteLLM now provides a Responses API endpoint for many LLMs, not just OpenAI, and this would be great to have in chat-sdk by default.
Does it support content annotations such as |
- Updates `ai` and `@ai-sdk/openai` versions to support the new OpenAI Responses API.
- Updates the default "small" and "large" models to use the Responses API.
- Adds a new `webSearchPreview` tool call to demo the new OpenAI Web Search Preview tool (a usage sketch follows below).
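A rough sketch of how these pieces could fit together; the model ID, the `openai.tools.webSearchPreview` helper, its options, and the returned `sources` field are taken from the public `@ai-sdk/openai` docs and may not match this PR's exact wiring:

```ts
import { openai } from '@ai-sdk/openai';
import { generateText } from 'ai';

// Use a Responses API model as the "small" model and attach the
// Web Search Preview tool so answers can be grounded in live results.
const { text, sources } = await generateText({
  model: openai.responses('gpt-4o-mini'),
  prompt: 'What are the latest updates to the OpenAI Responses API?',
  tools: {
    web_search_preview: openai.tools.webSearchPreview({
      searchContextSize: 'medium',
    }),
  },
});

console.log(text);
console.log(sources); // citations returned by the web search tool, when available
```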