Replies: 1 comment
- Had the same thing happen with the same model via the Python SDK.
-
When you request the log probs as well (available on the OpenAI chat API via the regular REST interface), the parameter is accepted, but the response does not contain the logprobs.
Here is how I call it:
$response = $openaiClient->chat()->create([
    'model' => 'gpt-4o-mini',
    'messages' => $messages,
    'temperature' => 0.7,
    'max_tokens' => 50,
    'n' => 5,
    'logprobs' => true,
]);
I would have expected an extra array in the response object containing the log probs.
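For comparison, this is the shape the raw REST interface documents when `"logprobs": true` is set: each choice carries a `logprobs` object whose `content` array has one entry per generated token. A minimal sketch in Python (the token and logprob values below are made up for illustration; only the structure follows the OpenAI API reference):

```python
import json

# Trimmed example of a chat completion payload with "logprobs": true.
# Field names follow the OpenAI API reference; values are invented.
sample_response = json.loads("""
{
  "choices": [
    {
      "index": 0,
      "message": {"role": "assistant", "content": "Hello"},
      "logprobs": {
        "content": [
          {"token": "Hello", "logprob": -0.012, "top_logprobs": []}
        ]
      }
    }
  ]
}
""")

# Each choice has its own "logprobs" object, one entry per token.
for choice in sample_response["choices"]:
    for entry in choice["logprobs"]["content"]:
        print(entry["token"], entry["logprob"])
```

If the PHP client's response object exposes no equivalent of this per-choice `logprobs` field, that would suggest the library is dropping it when mapping the JSON into its response objects.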