CompletionTokensDetails can be nil (should be optional) #20

Open · wants to merge 3 commits into base: main

6 changes: 3 additions & 3 deletions Sources/LLMChatOpenAI/ChatCompletion.swift
@@ -162,14 +162,14 @@ public struct ChatCompletion: Decodable, Sendable {

 public struct CompletionTokensDetails: Decodable, Sendable {
     /// When using Predicted Outputs, the number of tokens in the prediction that appeared in the completion.
-    public let acceptedPredictionTokens: Int
+    public let acceptedPredictionTokens: Int?

     /// When using Predicted Outputs, the number of tokens in the prediction that did not appear in the completion.
     /// However, like reasoning tokens, these tokens are still counted in the total completion tokens for purposes of billing, output, and context window limits.
-    public let rejectedPredictionTokens: Int
+    public let rejectedPredictionTokens: Int?

     /// Tokens generated by the model for reasoning.
-    public let reasoningTokens: Int
+    public let reasoningTokens: Int?

     private enum CodingKeys: String, CodingKey {
         case acceptedPredictionTokens = "accepted_prediction_tokens"

6 changes: 3 additions & 3 deletions Sources/LLMChatOpenAI/ChatCompletionChunk.swift
@@ -154,14 +154,14 @@ public struct ChatCompletionChunk: Decodable, Sendable {

 public struct CompletionTokensDetails: Decodable, Sendable {
     /// When using Predicted Outputs, the number of tokens in the prediction that appeared in the completion.
-    public let acceptedPredictionTokens: Int
+    public let acceptedPredictionTokens: Int?

     /// When using Predicted Outputs, the number of tokens in the prediction that did not appear in the completion.
     /// However, like reasoning tokens, these tokens are still counted in the total completion tokens for purposes of billing, output, and context window limits.
-    public let rejectedPredictionTokens: Int
+    public let rejectedPredictionTokens: Int?

     /// Tokens generated by the model for reasoning.
-    public let reasoningTokens: Int
+    public let reasoningTokens: Int?

     private enum CodingKeys: String, CodingKey {
         case acceptedPredictionTokens = "accepted_prediction_tokens"
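
For reference, a minimal standalone sketch (not part of the library; the JSON payload below is hypothetical) of why the optionals matter: with non-optional Int properties, decoding a token-details object that omits any of these keys throws keyNotFound, whereas with Int? the synthesized Decodable conformance falls back to decodeIfPresent and missing fields simply decode as nil.

import Foundation

// Simplified stand-in for the CompletionTokensDetails struct touched by this PR.
struct CompletionTokensDetails: Decodable {
    let acceptedPredictionTokens: Int?
    let rejectedPredictionTokens: Int?
    let reasoningTokens: Int?

    private enum CodingKeys: String, CodingKey {
        case acceptedPredictionTokens = "accepted_prediction_tokens"
        case rejectedPredictionTokens = "rejected_prediction_tokens"
        case reasoningTokens = "reasoning_tokens"
    }
}

// Hypothetical response fragment: only reasoning_tokens is present.
let json = Data(#"{ "reasoning_tokens": 128 }"#.utf8)

let details = try JSONDecoder().decode(CompletionTokensDetails.self, from: json)
print(details.reasoningTokens ?? 0)          // 128
print(details.acceptedPredictionTokens ?? 0) // 0 (key absent, decoded as nil)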