
fix: non default temperature check #2134


Open · wants to merge 2 commits into base: main

Conversation

lovets18 (Contributor)

Change
Added a check for the models o3-mini, o4-mini, and o3.

Reason
The models o3-mini, o4-mini, and o3 have a temperature parameter, but they don't support setting it to a non-default value, so I encountered this error:

BadRequestError: Error code: 400 - {
    'error': {
        'message': "Unsupported value: 'temperature' does not support 1E-8 with this model. Only the default (1) value is supported.",
        'type': 'invalid_request_error',
        'param': 'temperature',
        'code': 'unsupported_value'
    }
}
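The check this PR adds can be sketched as a small guard built around the hardcoded list `MODELS_NOT_SUPPORT_TEMP` that the PR introduces. The helper name `ragas_temperature` is hypothetical; the actual change lives inside `LangchainLLMWrapper` in `ragas/src/ragas/llms/base.py`.

```python
# Hypothetical helper illustrating the guard this PR adds.
MODELS_NOT_SUPPORT_TEMP = ["o3-mini", "o4-mini", "o3"]


def ragas_temperature(model_name):
    """Return the near-zero temperature Ragas uses for deterministic
    outputs, or None when the model only accepts the default (1)."""
    if model_name in MODELS_NOT_SUPPORT_TEMP:
        return None  # leave temperature unset so the API default applies
    return 1e-8
```

Returning `None` means the wrapper skips setting `temperature` on the request entirely, which avoids the 400 response shown above.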

@dosubot bot added the size:S label (This PR changes 10-29 lines, ignoring generated files.) on Jul 23, 2025

@greptile-apps bot left a comment


Greptile Summary

This PR adds temperature restriction handling for specific OpenAI models (o3-mini, o4-mini, o3) that only support default temperature values. The change introduces a hardcoded list MODELS_NOT_SUPPORT_TEMP containing these model names and modifies the LangchainLLMWrapper class to check against this list before setting temperature parameters.

The fix addresses a specific API limitation where newer OpenAI models throw a BadRequestError when temperature is set to non-default values (like the 1e-8 that Ragas uses for deterministic outputs). The implementation adds conditional checks in both synchronous and asynchronous text generation methods (generate_text and agenerate_text) that only set the temperature if the model is not in the restricted list.

This change fits into the broader Ragas architecture by extending the existing LLM wrapper functionality to handle model-specific API constraints. The fix maintains backward compatibility while enabling support for these newer OpenAI models without requiring users to modify their code.
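As a rough sketch of the conditional described above (not the actual Ragas source; the class is condensed and the attribute names `model_name` and `temperature` on the wrapped LLM are assumptions), the logic in `generate_text` looks roughly like:

```python
# Condensed stand-in for LangchainLLMWrapper; real method signatures differ.
MODELS_NOT_SUPPORT_TEMP = ["o3-mini", "o4-mini", "o3"]


class LLMWrapperSketch:
    def __init__(self, langchain_llm):
        self.langchain_llm = langchain_llm

    def generate_text(self, prompt, temperature=1e-8):
        llm = self.langchain_llm
        # Only touch temperature when the attribute exists AND the model
        # accepts non-default values; otherwise leave the API default.
        if (
            hasattr(llm, "temperature")
            and getattr(llm, "model_name", None) not in MODELS_NOT_SUPPORT_TEMP
        ):
            llm.temperature = temperature
        return llm.invoke(prompt)
```

Per the PR summary, the same conditional is mirrored in the asynchronous `agenerate_text` path.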

Confidence score: 3/5

  • This PR addresses a real API limitation but has implementation concerns that could lead to inconsistent behavior
  • The score reflects potential issues with the conditional logic structure and maintainability of hardcoded model lists
  • Files that need more attention: ragas/src/ragas/llms/base.py - the temperature restriction logic should be reviewed for consistency across different LLM implementations

The main concerns are:

  1. Inconsistent behavior: The temperature restriction is only applied when both temperature and model_name attributes exist, which could create different behavior across LLM implementations
  2. Maintainability: The hardcoded model list may become outdated as OpenAI releases new models with similar restrictions
  3. Missing coverage: The fix only applies to LangchainLLMWrapper but other LLM implementations in the codebase might face the same issue
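One possible way to soften concern 2 (not part of this PR, and the prefix set below is an assumption) would be prefix matching instead of an exact-name list, so hypothetical dated snapshot names such as "o3-mini-2025-01-31" would also be caught:

```python
# Hypothetical alternative to a hardcoded exact-name list.
RESTRICTED_PREFIXES = ("o3", "o4-mini")  # assumed set of restricted families


def supports_custom_temperature(model_name):
    # Any model whose name starts with a restricted prefix is treated
    # as default-temperature-only, covering dated variants as well.
    return not model_name.startswith(RESTRICTED_PREFIXES)
```

The trade-off is over-matching: any future model whose name happens to share a restricted prefix would also be denied a custom temperature.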

1 file reviewed, 1 comment

