
Migrate OpenAI models away from max_tokens to max_completion_tokens #1206


Merged: 2 commits into pydantic:main on Mar 22, 2025

Conversation

@barapa (Contributor) commented Mar 21, 2025

`max_tokens` is deprecated and is not supported by all OpenAI models (newer reasoning models such as o3-mini reject it); `max_completion_tokens` is the replacement.

See https://platform.openai.com/docs/api-reference/chat/create#chat-create-max_tokens

Fixes #1205
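For illustration, a minimal sketch of the parameter rename this PR performs. The helper name `build_chat_request` is hypothetical; only the `max_tokens` → `max_completion_tokens` key change reflects the OpenAI Chat Completions API:

```python
def build_chat_request(model: str, messages: list[dict], limit: int) -> dict:
    """Build chat-completion kwargs using the newer token-limit parameter.

    `max_tokens` is deprecated and rejected by reasoning models such as
    o3-mini; `max_completion_tokens` is the supported replacement.
    (Hypothetical helper for illustration, not pydantic-ai's actual code.)
    """
    return {
        "model": model,
        "messages": messages,
        # formerly: "max_tokens": limit
        "max_completion_tokens": limit,
    }


request = build_chat_request(
    "o3-mini",
    [{"role": "user", "content": "Hello"}],
    256,
)
```

The resulting dict can be splatted into `client.chat.completions.create(**request)`; older models accept `max_completion_tokens` as well, so the rename is safe across the model lineup.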

@Kludex (Member) commented Mar 22, 2025

We need to improve what we return when there's a finish reason.

@Kludex Kludex enabled auto-merge (squash) March 22, 2025 09:41
@Kludex Kludex merged commit 79faa27 into pydantic:main Mar 22, 2025
14 checks passed
Successfully merging this pull request may close these issues.

Setting max_tokens for openai's o3-mini model throws 400 error