Conversation

Contributor

@Chesars Chesars commented Dec 17, 2025

Relevant issues

Fixes #16340

Pre-Submission checklist

Please complete all items before asking a LiteLLM maintainer to review your PR

  • I have added testing in the tests/litellm/ directory (adding at least 1 test is a hard requirement; see details)
  • My PR passes all unit tests via make test-unit
  • My PR's scope is as isolated as possible; it only solves 1 specific problem

Type

🆕 New Feature

Changes

Add opt-in support for Gemini's native responseJsonSchema parameter (Gemini 2.0+ models).

What is responseJsonSchema?

Gemini API supports two schema formats for structured outputs:

  • responseSchema (OpenAPI format) - uppercase types, no additionalProperties
  • responseJsonSchema (JSON Schema format) - standard format, supports additionalProperties
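For illustration, here is the same one-field schema written in both formats (hand-written example dicts, not LiteLLM output):

```python
# responseSchema: OpenAPI-style subset - uppercase type names,
# no additionalProperties support.
openapi_style = {
    "type": "OBJECT",
    "properties": {"name": {"type": "STRING"}},
}

# responseJsonSchema: standard JSON Schema - lowercase type names,
# additionalProperties is honored (Gemini 2.0+ only).
json_schema_style = {
    "type": "object",
    "properties": {"name": {"type": "string"}},
    "additionalProperties": False,
}
```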

Benefits of responseJsonSchema:

  • Standard JSON Schema format (lowercase types like string, object)
  • Supports additionalProperties: false for stricter validation
  • Better compatibility with Pydantic's model_json_schema()
  • No propertyOrdering required
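To see why these benefits matter, here is a rough sketch (illustrative only, not LiteLLM's actual transformation code) of the lossy conversion a standard JSON Schema undergoes to fit the OpenAPI-style responseSchema: type names get uppercased and additionalProperties is silently dropped.

```python
def to_openapi_schema(schema: dict) -> dict:
    """Hypothetical sketch of the lossy JSON Schema -> responseSchema
    conversion: types are uppercased, additionalProperties is dropped."""
    out = {}
    for key, value in schema.items():
        if key == "additionalProperties":
            continue  # not representable in responseSchema
        if key == "type" and isinstance(value, str):
            out[key] = value.upper()
        elif key == "properties":
            out[key] = {k: to_openapi_schema(v) for k, v in value.items()}
        elif isinstance(value, dict):
            out[key] = to_openapi_schema(value)
        else:
            out[key] = value
    return out
```

With use_json_schema: True, the schema is passed through in its standard form instead, so nothing is lost.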

Usage:

response = litellm.completion(
    model="gemini/gemini-2.0-flash",
    messages=[{"role": "user", "content": "..."}],
    response_format={
        "type": "json_schema",
        "json_schema": {
            "schema": {
                "type": "object",
                "properties": {"name": {"type": "string"}},
                "additionalProperties": False,  # Now works!
            }
        },
        "use_json_schema": True,  # Opt-in to responseJsonSchema
    }
)
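What additionalProperties: false buys you, in miniature (a toy check for illustration, not a real JSON Schema validator):

```python
def extra_keys(data: dict, schema: dict) -> list:
    """Toy illustration of additionalProperties: false - keys not
    declared in the schema's properties are flagged as violations."""
    if schema.get("additionalProperties", True):
        return []  # extra keys allowed by default
    return [k for k in data if k not in schema.get("properties", {})]
```

With responseSchema there is no way to express this constraint, so a model could emit keys your schema never declared.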

Backwards Compatible:

  • Without use_json_schema: True, behavior is unchanged (uses responseSchema)
  • For older models (Gemini 1.5), falls back to responseSchema with a warning
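The opt-in-plus-fallback behavior can be sketched roughly like this (supports_response_json_schema matches the helper name this PR adds to common_utils.py; the model-name check and the dispatch function are illustrative assumptions, not the PR's actual code):

```python
import warnings


def supports_response_json_schema(model: str) -> bool:
    """Hypothetical version check: responseJsonSchema is only
    available on Gemini 2.0+ models."""
    return not any(old in model for old in ("gemini-1.0", "gemini-1.5"))


def pick_schema_field(model: str, use_json_schema: bool) -> str:
    """Illustrative dispatch: which GenerationConfig field gets the schema."""
    if use_json_schema and supports_response_json_schema(model):
        return "response_json_schema"
    if use_json_schema:
        # Older model: fall back to responseSchema with a warning.
        warnings.warn(
            f"{model} does not support responseJsonSchema; "
            "falling back to responseSchema"
        )
    return "response_schema"
```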

Files Changed:

  • litellm/types/llms/vertex_ai.py - Added response_json_schema to GenerationConfig
  • litellm/llms/vertex_ai/common_utils.py - Added supports_response_json_schema() and _build_json_schema()
  • litellm/llms/vertex_ai/gemini/vertex_and_google_ai_studio_gemini.py - Modified transformation to support opt-in
  • tests/.../test_vertex_and_google_ai_studio_gemini.py - Added 2 tests

Test

  • test_vertex_ai_response_json_schema_opt_in - Verifies opt-in works for Gemini 2.0+
  • test_vertex_ai_response_json_schema_fallback_for_old_models - Verifies fallback for older models
  • 51 tests pass
