
[BUG] o4-mini usage raising "stop parameter not supported" exception #2661

Open
@tspecht

Description


When attempting to use o4-mini as the model for an agent, the OpenAI API returns a 400 error indicating that the stop parameter is not supported.

Stacktrace:

File "/usr/local/lib/python3.12/site-packages/crewai/agents/crew_agent_executor.py", line 208, in _invoke_loop
    raise e
  File "/usr/local/lib/python3.12/site-packages/crewai/agents/crew_agent_executor.py", line 155, in _invoke_loop
    answer = get_llm_response(
             ^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/crewai/utilities/agent_utils.py", line 157, in get_llm_response
    raise e
  File "/usr/local/lib/python3.12/site-packages/crewai/utilities/agent_utils.py", line 148, in get_llm_response
    answer = llm.call(
             ^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/crewai/llm.py", line 794, in call
    return self._handle_non_streaming_response(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/agents/utilities.py", line 90, in _handle_non_streaming_response
    return super()._handle_non_streaming_response(params, callbacks, available_functions)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/crewai/llm.py", line 630, in _handle_non_streaming_response
    response = litellm.completion(**params)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/litellm/utils.py", line 1154, in wrapper
    raise e
  File "/usr/local/lib/python3.12/site-packages/litellm/utils.py", line 1032, in wrapper
    result = original_function(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/litellm/main.py", line 3068, in completion
    raise exception_type(
          ^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 2201, in exception_type
    raise e
  File "/usr/local/lib/python3.12/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 326, in exception_type
    raise BadRequestError(
litellm.exceptions.BadRequestError: litellm.BadRequestError: OpenAIException - Error code: 400 - {'error': {'message': "Unsupported parameter: 'stop' is not supported with this model.", 'type': 'invalid_request_error', 'param': 'stop', 'code': 'unsupported_parameter'}}

Code that adds the erroneous parameter (in crewai/llm.py, where the completion params are assembled before the litellm.completion call shown in the stacktrace):

"stop": self.stop,

Solution:
Removing the stop parameter in a custom LLM subclass makes the issue go away. Example:

class LLMWithFixedStopWords(LLM):
    def _handle_streaming_response(self, params, callbacks, available_functions):
        # o4-mini rejects the "stop" parameter, so drop it before delegating.
        if "o4-mini" in self.model:
            params.pop("stop", None)
        return super()._handle_streaming_response(params, callbacks, available_functions)

    def _handle_non_streaming_response(self, params, callbacks, available_functions):
        if "o4-mini" in self.model:
            params.pop("stop", None)
        return super()._handle_non_streaming_response(params, callbacks, available_functions)
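
For completeness, a sketch of wiring the subclass into an agent; the role/goal/backstory values are placeholders for illustration, not from the report:

from crewai import Agent

agent = Agent(
    role="Researcher",                           # placeholder values
    goal="Answer a test question",
    backstory="Minimal agent exercising the workaround",
    llm=LLMWithFixedStopWords(model="o4-mini"),  # drop-in replacement for LLM
)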

Steps to Reproduce

  1. Create any agent that uses o4-mini as the model
  2. Run the agent with any task (a minimal reproduction sketch follows below)
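
A minimal reproduction sketch, assuming crewAI's standard Agent/Task/Crew API; the role, goal, and task text are arbitrary placeholders:

from crewai import Agent, Task, Crew, LLM

agent = Agent(
    role="Assistant",
    goal="Say hello",
    backstory="Minimal agent to reproduce the bug",
    llm=LLM(model="o4-mini"),
)
task = Task(
    description="Reply with a short greeting.",
    expected_output="A one-line greeting",
    agent=agent,
)
# Raises litellm.BadRequestError: "'stop' is not supported with this model"
Crew(agents=[agent], tasks=[task]).kickoff()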

Expected behavior

Agent runs successfully

Screenshots/Code snippets

See in description

Operating System

Ubuntu 20.04

Python Version

3.11

crewAI Version

0.114

crewAI Tools Version

N/A

Virtual Environment

Venv

Evidence

See description

Possible Solution

See description. A more general variant is sketched below.
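
Instead of hardcoding model names, a subclass could ask litellm which parameters the model supports. This is an untested sketch; litellm.get_supported_openai_params is an existing litellm helper, and the override points mirror the subclass in the description:

import litellm
from crewai import LLM

class LLMWithSupportedStop(LLM):
    def _strip_unsupported_stop(self, params):
        # get_supported_openai_params returns the OpenAI-style params
        # litellm knows the model accepts (or None for unknown models).
        supported = litellm.get_supported_openai_params(model=self.model)
        if supported is not None and "stop" not in supported:
            params.pop("stop", None)
        return params

    def _handle_streaming_response(self, params, callbacks, available_functions):
        return super()._handle_streaming_response(
            self._strip_unsupported_stop(params), callbacks, available_functions
        )

    def _handle_non_streaming_response(self, params, callbacks, available_functions):
        return super()._handle_non_streaming_response(
            self._strip_unsupported_stop(params), callbacks, available_functions
        )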

Additional context

See description
