
[BUG] IndexError: list index out of range in ollama_pt() when messages list is empty or malformed #2740

Closed
@SmartITCentre

Description


When running `crewai run` (via uv) with LiteLLM connected to an Ollama backend, an unhandled IndexError is raised inside `ollama_pt()` because the messages list is empty or improperly structured. The error then surfaces as an APIConnectionError, which obscures the root cause.

Environment:
Python: 3.12.9
Model Host: Msty
OS: Windows
Command: crewai run

Stack Trace
```
File "...\litellm\main.py", line 2870, in completion
  response = base_llm_http_handler.completion(
File "...\litellm\llms\custom_httpx\llm_http_handler.py", line 269, in completion
  data = provider_config.transform_request(
File "...\litellm\llms\ollama\completion\transformation.py", line 322, in transform_request
  modified_prompt = ollama_pt(model=model, messages=messages)
File "...\litellm_core_utils\prompt_templates\factory.py", line 229, in ollama_pt
  tool_calls = messages[msg_i].get("tool_calls")
IndexError: list index out of range
```

Steps to Reproduce

Set up a CrewAI pipeline with the following agents:
1. Agent 1: Analyzes the user query
2. Agent 2: Searches a Qdrant vector database
3. Agent 3: Gathers the search results and returns a response

Run the project using:
crewai run

Ensure that the messages list (built internally by LiteLLM) ends up empty or improperly constructed.

Observe that the LLM call fails with an IndexError, which is then surfaced as a misleading APIConnectionError.

Expected behavior

The task should complete successfully. At a minimum, an empty or malformed messages list should be rejected with a clear validation error instead of an unhandled IndexError wrapped in an APIConnectionError.

Screenshots/Code snippets

(Screenshot of the LLM error; the same output is reproduced as text under Evidence below.)

Operating System

Windows 10

Python Version

3.12

crewAI Version

v0.118.0

crewAI Tools Version

0.43.0

Virtual Environment

Conda

Evidence

`╭──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────── LLM Error ────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
│ │
│ ❌ LLM Call Failed │
│ Error: litellm.APIConnectionError: list index out of range │
│ Traceback (most recent call last): │
│ File "D:\AWS_Crew\security.venv\Lib\site-packages\litellm\main.py", line 2870, in completion │
│ response = base_llm_http_handler.completion( │
│ ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ │
│ File "D:\AWS_Crew\security.venv\Lib\site-packages\litellm\llms\custom_httpx\llm_http_handler.py", line 269, in completion │
│ data = provider_config.transform_request( │
│ ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ │
│ File "D:\AWS_Crew\security.venv\Lib\site-packages\litellm\llms\ollama\completion\transformation.py", line 322, in transform_request │
│ modified_prompt = ollama_pt(model=model, messages=messages) │
│ ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ │
│ File "D:\AWS_Crew\security.venv\Lib\site-packages\litellm\litellm_core_utils\prompt_templates\factory.py", line 229, in ollama_pt │
│ tool_calls = messages[msg_i].get("tool_calls") │
│ ~~~~~~~~^^^^^^^ │
│ IndexError: list index out of range │
│ │
│ │
╰───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯

2025-05-02 13:05:45,577 - 12064 - llm.py-llm:903 - ERROR: LiteLLM call failed: litellm.APIConnectionError: list index out of range
Traceback (most recent call last):
File "D:\AWS_Crew\security.venv\Lib\site-packages\litellm\main.py", line 2870, in completion
response = base_llm_http_handler.completion(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\AWS_Crew\security.venv\Lib\site-packages\litellm\llms\custom_httpx\llm_http_handler.py", line 269, in completion
data = provider_config.transform_request(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\AWS_Crew\security.venv\Lib\site-packages\litellm\llms\ollama\completion\transformation.py", line 322, in transform_request
modified_prompt = ollama_pt(model=model, messages=messages)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\AWS_Crew\security.venv\Lib\site-packages\litellm\litellm_core_utils\prompt_templates\factory.py", line 229, in ollama_pt
tool_calls = messages[msg_i].get("tool_calls")
~~~~~~~~^^^^^^^
IndexError: list index out of range

Error during LLM call: litellm.APIConnectionError: list index out of range
Traceback (most recent call last):
File "D:\AWS_Crew\security.venv\Lib\site-packages\litellm\main.py", line 2870, in completion
response = base_llm_http_handler.completion(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\AWS_Crew\security.venv\Lib\site-packages\litellm\llms\custom_httpx\llm_http_handler.py", line 269, in completion
data = provider_config.transform_request(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\AWS_Crew\security.venv\Lib\site-packages\litellm\llms\ollama\completion\transformation.py", line 322, in transform_request
modified_prompt = ollama_pt(model=model, messages=messages)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\AWS_Crew\security.venv\Lib\site-packages\litellm\litellm_core_utils\prompt_templates\factory.py", line 229, in ollama_pt
tool_calls = messages[msg_i].get("tool_calls")
~~~~~~~~^^^^^^^
IndexError: list index out of range

An unknown error occurred. Please check the details below.
Error details: litellm.APIConnectionError: list index out of range
Traceback (most recent call last):
File "D:\AWS_Crew\security.venv\Lib\site-packages\litellm\main.py", line 2870, in completion
response = base_llm_http_handler.completion(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\AWS_Crew\security.venv\Lib\site-packages\litellm\llms\custom_httpx\llm_http_handler.py", line 269, in completion
data = provider_config.transform_request(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\AWS_Crew\security.venv\Lib\site-packages\litellm\llms\ollama\completion\transformation.py", line 322, in transform_request
modified_prompt = ollama_pt(model=model, messages=messages)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\AWS_Crew\security.venv\Lib\site-packages\litellm\litellm_core_utils\prompt_templates\factory.py", line 229, in ollama_pt
tool_calls = messages[msg_i].get("tool_calls")
~~~~~~~~^^^^^^^
IndexError: list index out of range`

Possible Solution

As a workaround on the CrewAI side, make the QdrantSearch tool use a local LLM for semantic search. A proper fix would validate the messages list before `ollama_pt()` indexes into it.

Additional context

Because the IndexError isn't caught early, it is surfaced as a misleading APIConnectionError, which complicates debugging in orchestrated agent workflows like CrewAI. Validating the messages list before transformation would help developers building structured multi-agent systems.
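The validation suggested above could look something like the following sketch. This is illustrative only: `validate_messages` and `walk_messages` are hypothetical helpers, not LiteLLM's actual API, and the real fix would belong in the message-merging loop of `ollama_pt()` in factory.py, where the traceback shows `messages[msg_i]` being indexed without a bounds re-check:

```python
from typing import Any


def validate_messages(messages: list[dict[str, Any]]) -> None:
    """Fail fast with a clear error instead of an IndexError deep in prompt building."""
    if not messages:
        raise ValueError("`messages` must contain at least one message")
    for i, msg in enumerate(messages):
        if not isinstance(msg, dict) or "role" not in msg:
            raise ValueError(f"messages[{i}] is malformed: expected a dict with a 'role' key")


def walk_messages(messages: list[dict[str, Any]]) -> list[str]:
    """Bounds-checked merge loop in the style of ollama_pt():
    consecutive same-role messages are merged, and the index is
    re-checked before every access."""
    validate_messages(messages)
    merged: list[str] = []
    msg_i = 0
    while msg_i < len(messages):
        role = messages[msg_i]["role"]
        chunk: list[str] = []
        # The inner loop re-checks the bound before indexing -- the check
        # that appears to be missing at the line in the reported traceback.
        while msg_i < len(messages) and messages[msg_i]["role"] == role:
            chunk.append(str(messages[msg_i].get("content", "")))
            msg_i += 1
        merged.append("\n".join(chunk))
    return merged
```

With this pattern, an empty or malformed list surfaces as a descriptive ValueError at the call site rather than an IndexError that LiteLLM's error handling later rewraps as an APIConnectionError.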
