
fix(litellm): convert message content to strings for providers requiring plain text format #582

Open · wants to merge 1 commit into base: main

Conversation


@schnetler schnetler commented Jul 31, 2025

Description

Fix message formatting issues when using LiteLLMModel with providers that require plain string content instead of content block arrays. These providers (OpenRouter, Ollama, Together AI, etc.) reject requests with validation errors like messages.1.content.0.text.text: Input should be a valid string.

Changes:

  • src/strands/models/litellm.py
    • Override format_request_tool_message – converts tool response content from arrays to plain strings
    • Override format_request_messages – converts assistant messages with tool_calls to use string/None content

No changes to base classes; only overrides in LiteLLMModel to handle provider-specific requirements.
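The conversion the overrides perform can be sketched as follows. This is a minimal illustration, not the actual implementation; flatten_content is a hypothetical helper name, and the content-block shape is assumed from the error message quoted above:

```python
def flatten_content(content):
    """Flatten Anthropic-style content blocks into a plain string.

    Strict OpenAI-compatible providers expect message content to be a
    string, not a list of blocks, so a list like
    [{"type": "text", "text": "4"}] becomes "4". Strings pass through
    unchanged.
    """
    if isinstance(content, str):
        return content
    # Concatenate the text of each block; ignore anything non-dict.
    return "".join(
        block.get("text", "") for block in content if isinstance(block, dict)
    )
```

In the overridden methods, a helper like this would be applied to tool response content (and, for assistant messages carrying tool_calls, empty content would be sent as None rather than an empty array).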

Related Issues

N/A – identified the issue independently and raised this PR to fix it.

Documentation PR

N/A — Existing LiteLLM examples continue to work. The fix is transparent to users.

Type of Change

  • Bug fix

Testing

Verified the fix resolves the validation errors:

from strands.models.litellm import LiteLLMModel
from strands import Agent, tool

@tool
def add(x: int, y: int) -> int:
    return x + y

# Previously failed with validation error
model = LiteLLMModel(model_id="openrouter/anthropic/claude-3-haiku")
agent = Agent(model=model, tools=[add])
result = agent("What is 2+2?")  # ✅ Now works: "The sum of 2 + 2 is 4"

Also confirmed:

  • All unit tests pass (787 passed)

  • LiteLLM integration tests pass (Bedrock compatibility maintained)

  • Manually tested with OpenRouter: Claude 3 Haiku, Claude 3.5 Sonnet, GPT-4o Mini, Llama 3.3 70B, Nova Lite

  • No warnings or regressions in existing functionality

  • I ran hatch run prepare

Checklist

  • I have read the CONTRIBUTING document
  • I have added any necessary tests that prove my fix is effective or my feature works
  • I have updated the documentation accordingly, or no new docs are needed
  • I have added an appropriate example to the documentation to outline the feature, or no new docs are needed
  • My changes generate no new warnings
  • Any dependent changes have been merged and published

By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.

…a compatibility

Override format_request_tool_message and format_request_messages in LiteLLMModel
to convert Anthropic-style content blocks to OpenAI-compatible plain strings.

This fixes compatibility with LiteLLM providers that have strict OpenAI API
validation (OpenRouter, Ollama, Together AI, etc.) while maintaining
compatibility with AWS Bedrock.

Changes:
- Tool responses now send content as plain string instead of content blocks
- Assistant messages with tool_calls use string/None content instead of arrays

Fixes message format errors like:
- "messages.1.content.0.text.text: Input should be a valid string"
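For illustration, the shape of a tool message before and after the change might look like the following (field values and the tool_call_id are hypothetical; the "before" shape is inferred from the validation error above):

```python
# Before the fix: tool result content sent as a content-block array,
# which strict OpenAI-compatible providers reject with
# "Input should be a valid string".
rejected = {
    "role": "tool",
    "tool_call_id": "call_123",  # hypothetical id
    "content": [{"type": "text", "text": "4"}],
}

# After the fix: content sent as a plain string, accepted by
# OpenRouter, Ollama, Together AI, etc.
accepted = {
    "role": "tool",
    "tool_call_id": "call_123",  # hypothetical id
    "content": "4",
}
```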