Allow specifying JSON schema for chat completions #398
Conversation
```kotlin
@Serializable
public data class JsonSchema(
    /**
     * Optional name for the schema
     */
    @SerialName("name") val name: String? = null,

    /**
     * The JSON schema specification
     */
    @SerialName("schema") val schema: JsonObject,

    /**
     * Whether to enforce strict schema validation
     */
    @SerialName("strict") val strict: Boolean = true
)
```
This is the format of the JSON schema specified in the OpenAI Structured Outputs guide.
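For readers wondering what the `jsonSchema` value used in the test below might contain, here is a hedged sketch of one way to build it with kotlinx.serialization's `buildJsonObject`. The field names (`answerSchema`, the `"answer"` schema name) are illustrative assumptions, not part of this PR; the `additionalProperties: false` and exhaustive `required` list follow OpenAI's documented requirements for strict mode.

```kotlin
import kotlinx.serialization.json.add
import kotlinx.serialization.json.buildJsonObject
import kotlinx.serialization.json.put
import kotlinx.serialization.json.putJsonArray
import kotlinx.serialization.json.putJsonObject

// Hypothetical schema matching the Answer(question, response) shape used in
// the test below. Strict mode requires listing every property in "required"
// and setting "additionalProperties" to false.
val answerSchema = buildJsonObject {
    put("type", "object")
    putJsonObject("properties") {
        putJsonObject("question") { put("type", "string") }
        putJsonObject("response") { put("type", "string") }
    }
    putJsonArray("required") {
        add("question")
        add("response")
    }
    put("additionalProperties", false)
}

val jsonSchema = JsonSchema(
    name = "answer",      // optional identifier for the schema
    schema = answerSchema,
    strict = true         // ask the API to enforce exact conformance
)
```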
```kotlin
val request = chatCompletionRequest {
    model = ModelId("gpt-4o-mini-2024-07-18")
    responseFormat = jsonSchema(jsonSchema)
    messages {
        message {
            role = ChatRole.System
            content = "You are a helpful assistant."
        }
        message {
            role = ChatRole.System
            content = "All your answers should be valid JSON."
        }
        message {
            role = ChatRole.User
            content = "Who won the world cup in 1998?"
        }
    }
}
val response = openAI.chatCompletion(request)
val content = response.choices.first().message.content.orEmpty()

@Serializable
data class Answer(val question: String? = null, val response: String? = null)

val answer = Json.decodeFromString<Answer>(content)
assertNotNull(answer.question)
assertNotNull(answer.response)
```
Note that I do not specify the `Answer` schema in any of the messages, but it still responds with an answer that parses correctly, since I include the `jsonSchema` in the `responseFormat`.
This looks fantastic - @aallam Any reason not to merge & release this highly anticipated feature?

LGTM! Thanks a lot for your contribution!

@aallam, would you mind pushing these changes into the 4.0 beta?

+1 - @aallam Please, I could really do with access to these changes - they're critical for programmatic use cases of chat completions - is another beta tag release due?
For the folks (including @chris-hatton-tipmi) who need this feature right now, here's a guide on how to use the snapshot version:

```kotlin
// in build.gradle.kts
repositories {
    // ...
    // Add the snapshot repository
    maven { url = uri("https://oss.sonatype.org/content/repositories/snapshots/") }
}

dependencies {
    // ...
    // Add the snapshot dependency
    implementation("com.aallam.openai:openai-client:4.0.0-SNAPSHOT")
}
```
Describe your change
Allow specifying a JSON schema for chat completions. See the added unit test (`jsonSchema` in `TestChatCompletions`) for a simple example of how to use it.
Note: This feature is only available for GPT-4o and later models.
What problem is this fixing?
OpenAI's API supports specifying a JSON schema for structured output on recent models. This library did not previously allow the user to specify one; with this change, it does.
One benefit (as can be seen in the added unit test) is that you no longer have to spell out the schema in the messages themselves.
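To make that benefit concrete, here is a hedged before/after sketch. The `answerJsonSchema` value is a hypothetical `JsonSchema` instance built elsewhere; only `chatCompletionRequest`, `responseFormat`, and the `jsonSchema(...)` helper come from this PR.

```kotlin
// Before this PR: the schema had to be smuggled into a prompt message,
// with no server-side enforcement that the model actually complies.
val before = chatCompletionRequest {
    model = ModelId("gpt-4o-mini-2024-07-18")
    messages {
        message {
            role = ChatRole.System
            content = """Respond only with JSON of the form {"question": string, "response": string}."""
        }
        message {
            role = ChatRole.User
            content = "Who won the world cup in 1998?"
        }
    }
}

// After this PR: the schema travels in responseFormat and is enforced by
// the API itself. `answerJsonSchema` is a hypothetical JsonSchema value.
val after = chatCompletionRequest {
    model = ModelId("gpt-4o-mini-2024-07-18")
    responseFormat = jsonSchema(answerJsonSchema)
    messages {
        message {
            role = ChatRole.User
            content = "Who won the world cup in 1998?"
        }
    }
}
```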