
Conversation


@lbijeau lbijeau commented Feb 10, 2026

Summary

  • convertToOpenAIChatMessages now sends content: null instead of content: "" for assistant messages that only contain tool calls and no text content
  • Per the OpenAI API spec, content on assistant messages is nullable — their examples show content: null when the assistant only makes tool calls
  • The type definition (ChatCompletionAssistantMessage) already declared content?: string | null, so this aligns the runtime behavior with the type
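The change can be illustrated with a minimal sketch. The message shapes and the `toAssistantMessage` helper below are simplified stand-ins for illustration, not the SDK's actual internals:

```typescript
// Simplified shapes for illustration; the SDK's real types are richer.
type ToolCall = {
  id: string;
  type: 'function';
  function: { name: string; arguments: string };
};

type AssistantMessage = {
  role: 'assistant';
  content: string | null;
  tool_calls?: ToolCall[];
};

// Hypothetical helper mirroring the fix: emit `null`, not `""`,
// when the assistant turn carries only tool calls.
function toAssistantMessage(text: string, toolCalls: ToolCall[]): AssistantMessage {
  return {
    role: 'assistant',
    // Before the fix this would have been "" for a tool-call-only turn;
    // after the fix, empty text with tool calls becomes null.
    content: text.length > 0 ? text : null,
    ...(toolCalls.length > 0 ? { tool_calls: toolCalls } : {}),
  };
}

const msg = toAssistantMessage('', [
  {
    id: 'call_1',
    type: 'function',
    function: { name: 'get_weather', arguments: '{"city":"Berlin"}' },
  },
]);
console.log(JSON.stringify(msg.content)); // prints: null
```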

Motivation

Sending content: "" causes issues with OpenAI-compatible providers (e.g. Ollama) where the empty string is rendered differently by model templates. Specifically, with Ollama + qwen3-coder, the empty string in prior assistant messages causes the model to switch from structured tool_calls to text-based markup (<function=name>) on subsequent turns, breaking multi-turn tool calling.

Verified via curl that the same conversation with content: null keeps the model in structured tool-calling mode, while content: "" causes it to switch modes.

Also filed on the Ollama side: ollama/ollama#14181

Test plan

  • Updated existing test: tool-call-only assistant messages now expect content: null
  • Added test: tool-call-only assistant messages produce content: null
  • Added test: assistant messages with both text and tool calls preserve text in content
  • All 21 tests in convert-to-openai-chat-messages.test.ts pass

Fixes #12389

…only assistant messages

Per the OpenAI API spec, `content` on assistant messages is nullable.
When an assistant turn only contains tool calls and no text, the
content field should be `null` rather than an empty string.

Sending `content: ""` causes issues with OpenAI-compatible providers
(e.g. Ollama) where the empty string is rendered differently by model
templates, causing models like qwen3-coder to switch from structured
tool_calls to text-based markup on subsequent turns.

Fixes vercel#12389

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
lbijeau added a commit to lbijeau/meloqui that referenced this pull request Feb 10, 2026
… SDK and fix content hygiene

Replaces the native Ollama tool-calling loop with Vercel AI SDK's
generateText/streamText, adds a fetch-level workaround for the empty-content
bug that causes qwen3-coder to drop structured tool_calls, and introduces
streaming code-block re-prompt detection for models that output tools as
markdown.

Upstream fixes pending:
- vercel/ai#12390 (SDK sends content:"" instead of null)
- ollama/ollama#14182 (Ollama /v1 layer doesn't normalise "")
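Such a fetch-level workaround could look roughly like this. This is a hypothetical sketch, not the meloqui code: it normalizes `content: ""` to `null` on tool-call-only assistant messages before the request body reaches the provider:

```typescript
// Pure normalization step, kept separate so it can be tested
// without making a network call.
function normalizeBody(body: string): string {
  try {
    const payload = JSON.parse(body);
    if (Array.isArray(payload.messages)) {
      for (const m of payload.messages) {
        if (m.role === 'assistant' && m.content === '' && m.tool_calls?.length) {
          m.content = null;
        }
      }
      return JSON.stringify(payload);
    }
  } catch {
    // Not JSON; pass the body through unchanged.
  }
  return body;
}

// Hypothetical fetch wrapper applying the normalization, which an
// AI SDK provider could be configured to use in place of global fetch.
const patchedFetch: typeof fetch = (input, init) =>
  fetch(
    input,
    typeof init?.body === 'string'
      ? { ...init, body: normalizeBody(init.body) }
      : init,
  );
```

Keeping the rewrite in a pure string-to-string function means the workaround can be dropped once the upstream fixes land, without touching the transport code.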

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>


Development

Successfully merging this pull request may close these issues.

convertToOpenAIChatMessages sends content: "" instead of content: null for tool-call-only assistant messages
