
Conversation

@Chesars (Contributor) commented Dec 17, 2025

Relevant issues

Fixes #14753

Pre-Submission checklist

  • I have added testing in the tests/litellm/ directory
  • My PR passes all unit tests on make test-unit
  • My PR's scope is as isolated as possible, it only solves 1 specific problem

Type

🐛 Bug Fix

Summary

`presencePenalty` is a native Google Gemini API parameter, as documented in the Gemini API reference.

However, when using presence_penalty with Google AI Studio Gemini models via LiteLLM, the call raises an UnsupportedParamsError:

import litellm

response = litellm.completion(
    model='gemini/gemini-2.0-flash',
    messages=[{'role': 'user', 'content': 'Hello'}],
    presence_penalty=0.5
)
# ERROR: litellm.UnsupportedParamsError: gemini does not support parameters: ['presence_penalty']
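
For context, this is roughly what the parameter looks like in a raw Gemini generateContent request (an illustrative payload, not code from this PR):

# Illustrative raw Gemini API request body: presencePenalty is a native
# generationConfig field, which is the target LiteLLM maps the OpenAI-style
# presence_penalty onto.
payload = {
    "contents": [{"parts": [{"text": "Hello"}]}],
    "generationConfig": {"presencePenalty": 0.5},
}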

Solution

Added presence_penalty to the list of supported OpenAI params in GoogleAIStudioGeminiConfig.get_supported_openai_params().

The parameter mapping to presencePenalty (camelCase) already existed in the parent class VertexGeminiConfig, so only the supported params list needed updating.
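
A minimal sketch of the shape of the change (the class and method names come from this PR; the import path is assumed from the test location, and the other list entries are illustrative, not the exact contents of LiteLLM's list):

# Assumed module path, mirroring the test file location named below.
from litellm.llms.vertex_ai.gemini.vertex_and_google_ai_studio_gemini import (
    VertexGeminiConfig,
)

class GoogleAIStudioGeminiConfig(VertexGeminiConfig):
    def get_supported_openai_params(self, model: str) -> list:
        return [
            "temperature",
            "top_p",
            "max_tokens",
            "stream",
            "presence_penalty",  # added by this PR
            # ... remaining entries unchanged
        ]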

Tests Added

  • test_google_ai_studio_presence_penalty_supported in tests/test_litellm/llms/vertex_ai/gemini/test_vertex_and_google_ai_studio_gemini.py
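
A sketch of what such a test could assert (the test name and file are from this PR; the constructor and method signature are assumptions about LiteLLM's config API):

from litellm.llms.vertex_ai.gemini.vertex_and_google_ai_studio_gemini import (
    GoogleAIStudioGeminiConfig,
)

def test_google_ai_studio_presence_penalty_supported():
    # After the fix, presence_penalty should appear in the supported
    # OpenAI params for a Google AI Studio Gemini model.
    supported = GoogleAIStudioGeminiConfig().get_supported_openai_params(
        model="gemini-2.0-flash"
    )
    assert "presence_penalty" in supported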

vercel bot commented Dec 17, 2025

The latest updates on your projects. Learn more about Vercel for GitHub.

Project   Deployment   Review                     Updated (UTC)
litellm   Ready        Ready (Preview, Comment)   Dec 17, 2025 11:36pm


Development

Successfully merging this pull request may close these issues.

[Bug]: Google Gemini Flash 2.0 - Parameter "presence_penalty" not supported
