Conversation

@devin-ai-integration
Contributor

Fix: Allow new Gemini models to route to native provider without constants

Summary

Fixes #3949 where LLM(model='google/gemini-3-pro-preview') was failing to route to the native Gemini provider because the model wasn't in the GEMINI_MODELS constants list.

Changes:

  • Modified LLM._validate_model_in_constants() to allow any model starting with gemini-, gemma-, or learnlm- prefixes to route to the native Gemini provider, even if not explicitly listed in constants
  • This prevents breakage when Google releases new Gemini models before the constants list is updated
  • The Google SDK will surface a clear error if a model truly doesn't exist
  • Added 4 comprehensive tests covering:
    • New preview models routing correctly (including gemini-3-pro-preview)
    • Case-insensitive prefix validation
    • Fallback to LiteLLM for non-matching models
    • Existing models still working correctly

Technical approach:
Instead of strictly requiring models to be in GEMINI_MODELS, the validation now:

  1. First checks if the model is in the constants list (fast path for known models)
  2. If not found, checks if the model name starts with known Gemini prefixes (case-insensitive)
  3. Falls back to LiteLLM for models that don't match any prefix

This approach is more future-proof and aligns with how Azure models are handled (which also don't have strict validation).
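The three-step flow above can be sketched as follows. This is an illustrative reconstruction, not the actual crewAI internals: the function name, the `GEMINI_MODELS` sample set, and the return values are assumptions for the example.

```python
# Hypothetical sketch of the relaxed Gemini routing (names are illustrative).
GEMINI_MODELS = {"gemini-2.0-flash-001", "gemini-1.5-pro"}  # known constants
GEMINI_PREFIXES = ("gemini-", "gemma-", "learnlm-")

def resolve_gemini_route(model: str) -> str:
    """Return 'gemini' for native routing, 'litellm' for the fallback."""
    name = model.split("/", 1)[-1]  # strip the 'google/' provider prefix
    if name in GEMINI_MODELS:
        return "gemini"  # step 1: fast path for known models
    if name.lower().startswith(GEMINI_PREFIXES):
        return "gemini"  # step 2: case-insensitive prefix match for new models
    return "litellm"     # step 3: fall back to LiteLLM
```

Under this sketch, `resolve_gemini_route("google/gemini-3-pro-preview")` routes natively even though the model is absent from the constants, while `"google/unknown-model-xyz"` falls back to LiteLLM.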

Review & Testing Checklist for Human

⚠️ Risk Level: YELLOW - Unable to test locally due to environment issues, but changes follow existing patterns

  • Verify the fix resolves issue #3949 ([BUG] Gemini 3.0 Pro Preview doesn't work): Test that LLM(model='google/gemini-3-pro-preview') now routes to GeminiCompletion instead of falling back to LiteLLM
  • Test existing Gemini models: Verify that known models like google/gemini-2.0-flash-001 and google/gemini-1.5-pro still work correctly
  • Test invalid model handling: Verify that LLM(model='google/unknown-model-xyz') properly falls back to LiteLLM or surfaces an appropriate error
  • Review CI test results: Ensure all new tests pass, especially test_gemini_allows_new_preview_models_without_constants
  • Consider long-term approach: Evaluate whether prefix-based validation is the right strategy vs. maintaining an exhaustive constants list

Test Plan

from crewai import LLM  # LLM is exported from the top-level crewai package

# Test 1: Verify the reported issue is fixed
llm = LLM(model='google/gemini-3-pro-preview')
assert llm.__class__.__name__ == "GeminiCompletion"
assert llm.provider == "gemini"

# Test 2: Verify existing models still work
# (checking the class name avoids importing the internal GeminiCompletion class)
llm = LLM(model='google/gemini-2.0-flash-001')
assert llm.__class__.__name__ == "GeminiCompletion"

# Test 3: Verify invalid models fall back appropriately
llm = LLM(model='google/unknown-model-xyz')
assert llm.is_litellm is True

Notes

  • Unable to run tests locally: The uv.lock file had parsing errors, preventing local test execution. Changes were verified through code analysis and pattern matching with existing code.
  • Scope limited to Gemini: This change only affects Gemini model routing. OpenAI, Anthropic, and Bedrock providers maintain strict constant validation.
  • Case-insensitive matching: The implementation uses .lower() for prefix matching, which should handle various capitalizations (e.g., "Gemini-3-Pro", "GEMINI-3-FLASH").
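The case-insensitive check can be illustrated in isolation. The constant and helper names below are assumptions for the example, not taken from the codebase; the point is that `str.startswith` accepts a tuple, so one lowered comparison covers all three prefixes.

```python
GEMINI_PREFIXES = ("gemini-", "gemma-", "learnlm-")

def matches_gemini_prefix(model: str) -> bool:
    # Lowercase once, then test all prefixes in a single startswith call
    return model.lower().startswith(GEMINI_PREFIXES)
```

For example, "Gemini-3-Pro" and "GEMINI-3-FLASH" both match, while "unknown-model-xyz" does not.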

Link to Devin run: https://app.devin.ai/sessions/eb9b5faf184046608c23eb6de4e7a067
Requested by: João (joao@crewai.com)

…e provider

- Relaxed Gemini model validation to allow any model starting with gemini-, gemma-, or learnlm- prefixes
- This prevents breakage when new Gemini models are released
- The Google SDK will surface a clear error if a model truly doesn't exist
- Added comprehensive tests for new preview models, case-insensitive validation, and fallback behavior
- Fixes issue #3949

Co-Authored-By: João <joao@crewai.com>
@devin-ai-integration
Contributor Author

🤖 Devin AI Engineer

I'll be helping with this pull request! Here's what you should know:

✅ I will automatically:

  • Address comments on this PR. Add '(aside)' to your comment to have me ignore it.
  • Look at CI failures and help fix them

Note: I can only respond to comments from users who have write access to this repository.

⚙️ Control Options:

  • Disable automatic comment and CI monitoring


