fix: handle dangling tool_use blocks for LiteLLM proxy compatibility #8497
hendem wants to merge 3 commits into anomalyco:dev
Conversation
When sessions are compacted or interrupted, tool calls may be left in pending/running state without corresponding tool_result blocks. This causes Anthropic/Claude APIs (especially through LiteLLM proxies) to reject requests with validation errors. This change:

1. Converts pending/running tool calls to error results in toModelMessage, ensuring every tool_use has a corresponding tool_result
2. Adds a dummy _noop tool when message history contains tool calls but no tools are provided (required by LiteLLM proxies)

Fixes anomalyco#8246
Fixes anomalyco#2915
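In essence, the first fix looks roughly like this (a simplified sketch only; the real `ToolPart` shape and conversion logic in `message-v2.ts` differ):

```typescript
// Sketch of the "convert pending/running tool calls to error results" idea.
// ToolState/ToolPart are simplified stand-ins, not the actual opencode types.
type ToolState = "pending" | "running" | "completed" | "error"

interface ToolPart {
  callID: string
  tool: string
  state: ToolState
  output?: string
}

// Every tool_use must gain a tool_result; incomplete calls become error results.
function toToolResult(part: ToolPart): { callID: string; output: string; isError: boolean } {
  switch (part.state) {
    case "completed":
      return { callID: part.callID, output: part.output ?? "", isError: false }
    case "error":
      return { callID: part.callID, output: part.output ?? "Tool failed", isError: true }
    default:
      // pending/running: the session was interrupted before the tool finished
      return {
        callID: part.callID,
        output: `Tool execution was interrupted (state: ${part.state})`,
        isError: true,
      }
  }
}

const dangling: ToolPart = { callID: "tool_1", tool: "bash", state: "running" }
console.log(toToolResult(dangling).isError) // true
```

The key property is that no branch of the switch drops a tool call on the floor, so the assistant message can never contain a `tool_use` without a matching `tool_result`.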
The following comment was made by an LLM, it may be inaccurate: Based on my search, I found one potential duplicate/related PR.

Related PR found: PR #8248, "fix: avoid Anthropic tool-history requests without tools". Why it's related: that PR addresses a very similar issue, handling requests that contain tool calls/history but lack the `tools` parameter.

Also note: PR #3243 mentioned in the description ("Fix: Each tool_use block must have a corresponding tool_result block in the next message") is the stale/conflicted PR that #8497 is intended to supersede/fix properly.
Pull request overview
This PR fixes session compaction failures when using OpenCode with LiteLLM proxies by ensuring every tool call has a corresponding tool result and that the tools parameter is present when needed.
Changes:
- Pending and running tool calls now generate error tool results to prevent dangling tool_use blocks
- A dummy `_noop` tool is added when message history contains tool calls but no tools are provided
- Test coverage added for pending/running tool call conversion
Reviewed changes
Copilot reviewed 3 out of 3 changed files in this pull request and generated 2 comments.
| File | Description |
|---|---|
| packages/opencode/src/session/message-v2.ts | Adds handling for pending/running tool states to convert them to error results |
| packages/opencode/src/session/llm.ts | Adds dummy tool for LiteLLM proxy compatibility and helper function to detect tool calls |
| packages/opencode/test/session/message-v2.test.ts | Adds test coverage for pending/running tool call error result generation |
Re: Related PR #8248

Thanks for flagging the potential duplicate. I've reviewed PR #8248 and here's how this PR differs: PR #8248 only addresses Issue 1 (the missing `tools` parameter), while this PR (#8497) addresses both issues.

The second fix is critical because even with the dummy tool, any dangling `tool_use` blocks would still cause the request to be rejected. I'd suggest either closing #8248 in favor of this more comprehensive fix, or we could cherry-pick the provider-specific check from #8248 if the team prefers a more targeted approach for Issue 1.
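For context, the dummy-tool workaround (Fix 1) can be sketched as follows. This is a simplified illustration, not the actual `llm.ts` code; the message/part shapes and the `withNoopTool` helper name are assumptions:

```typescript
// Hypothetical sketch of Fix 1: inject a placeholder tool when message
// history contains tool calls but the request carries no tools.
interface ToolDef {
  description: string
  parameters: object
}

interface Message {
  parts: { type: string }[]
}

// True if any message in the history carries a tool-call part.
function hasToolCalls(messages: Message[]): boolean {
  return messages.some((m) => m.parts.some((p) => p.type === "tool"))
}

function withNoopTool(tools: Record<string, ToolDef>, messages: Message[]): Record<string, ToolDef> {
  // Only inject when there are no tools AND the history references tool calls.
  if (Object.keys(tools).length > 0 || !hasToolCalls(messages)) return tools
  return {
    _noop: {
      description:
        "Placeholder tool; never invoked. Satisfies proxies that require a tools parameter when history contains tool calls.",
      parameters: { type: "object", properties: {} },
    },
  }
}

const msgs: Message[] = [{ parts: [{ type: "tool" }] }]
console.log(Object.keys(withNoopTool({}, msgs))) // ["_noop"]
```

Since `_noop` is kept out of `activeTools`, the model is never steered toward calling it; it exists purely to make the request schema-valid.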
Pull request overview
Copilot reviewed 4 out of 4 changed files in this pull request and generated no new comments.
I've created #8658 which builds on this PR and addresses @rekram1-node's feedback from #8248. The key change: it restricts the LiteLLM-specific behavior to detected LiteLLM proxies.

Detection logic:

```ts
const isLiteLLMProxy =
  provider.options?.["litellmProxy"] === true ||
  input.model.providerID.toLowerCase().includes("litellm") ||
  input.model.api.id.toLowerCase().includes("litellm")
```

Config example for custom gateways whose IDs don't contain "litellm":

```json
{
  "provider": {
    "my-gateway": {
      "options": { "litellmProxy": true }
    }
  }
}
```

Happy to discuss merging this into your PR or keeping them separate.
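To make the detection testable in isolation, it can be wrapped in a small standalone function (a sketch; `ModelInfo`/`ProviderOptions` field names here are assumptions, simplified from the real `input.model` shape):

```typescript
// Standalone version of the #8658 detection logic for illustration.
interface ModelInfo {
  providerID: string
  apiID: string
}

interface ProviderOptions {
  litellmProxy?: boolean
}

function isLiteLLMProxy(options: ProviderOptions, model: ModelInfo): boolean {
  return (
    // Explicit opt-in for gateways whose IDs don't mention litellm
    options.litellmProxy === true ||
    // Heuristic: provider or API id names the proxy
    model.providerID.toLowerCase().includes("litellm") ||
    model.apiID.toLowerCase().includes("litellm")
  )
}

console.log(isLiteLLMProxy({}, { providerID: "my-litellm-gateway", apiID: "anthropic" })) // true
console.log(isLiteLLMProxy({ litellmProxy: true }, { providerID: "my-gateway", apiID: "claude" })) // true
console.log(isLiteLLMProxy({}, { providerID: "anthropic", apiID: "claude" })) // false
```

The explicit `litellmProxy` option matters because the name heuristic alone cannot catch corporate gateways with opaque IDs.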
Force-pushed 00637c0 to 71e0ba2
Force-pushed f1ae801 to 08fa7f7
What's blocking this PR? I need this to be merged.
Problem
When using OpenCode with LiteLLM proxies (e.g., for Vertex AI, Bedrock), session compaction fails with "Each tool_use block must have a corresponding tool_result block in the next message", or with a LiteLLM validation error about the missing `tools` parameter.
This affects users who route Anthropic/Claude API calls through corporate LiteLLM proxies and cannot modify proxy settings (e.g., `modify_params=True`).

Root Cause Analysis
There are two distinct issues causing these errors:
Issue 1: Missing `tools` parameter during compaction

When compaction runs, it calls `processor.process()` with `tools: {}` (compaction.ts:149). LiteLLM validates that if message history contains `tool_use`/`tool_result` blocks, the `tools` parameter must be present. While Anthropic's native API now handles this gracefully, LiteLLM enforces stricter validation.

Issue 2: Dangling
`tool_use` blocks without `tool_result`

When a session is interrupted or aborted, tool calls may be left in `pending` or `running` state. The `toModelMessage()` function in `message-v2.ts` currently skips these incomplete tool calls. This creates `tool_use` blocks in the assistant message without corresponding `tool_result` blocks, which Anthropic/Claude APIs strictly reject.

Solution
Fix 1: Add dummy `_noop` tool when message history contains tool calls

In `llm.ts`, detect when messages contain tool calls but no tools are provided, and inject a placeholder tool. The `_noop` tool is excluded from `activeTools` so it won't be suggested or called by the LLM.

Fix 2: Convert pending/running tool calls to error results
In `message-v2.ts`, handle `pending` and `running` tool states by generating error `tool_result` blocks. This ensures every `tool_use` has a corresponding `tool_result`, satisfying Anthropic's API requirements.

Files Changed
- `packages/opencode/src/session/llm.ts`: `hasToolCalls()` helper and dummy `_noop` tool injection
- `packages/opencode/src/session/message-v2.ts`: convert pending/running tool states to error results
- `packages/opencode/test/session/message-v2.test.ts`: test coverage for pending/running tool call conversion

Testing
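The invariant the new tests assert can be demonstrated with a standalone check (a sketch with simplified block shapes; the real tests exercise `toModelMessage` directly):

```typescript
// Invariant: after conversion, every tool_use id has a matching tool_result id.
type Block = { type: "tool_use" | "tool_result"; id: string }

// Returns the ids of tool_use blocks that lack a matching tool_result.
function danglingToolUses(blocks: Block[]): string[] {
  const results = new Set(blocks.filter((b) => b.type === "tool_result").map((b) => b.id))
  return blocks.filter((b) => b.type === "tool_use" && !results.has(b.id)).map((b) => b.id)
}

// Before the fix: an interrupted call leaves a dangling tool_use.
const before: Block[] = [{ type: "tool_use", id: "c1" }]
console.log(danglingToolUses(before)) // ["c1"]  (request would be rejected)

// After the fix: an error tool_result is injected for the interrupted call.
const after: Block[] = [
  { type: "tool_use", id: "c1" },
  { type: "tool_result", id: "c1" },
]
console.log(danglingToolUses(after)) // []  (request is accepted)
```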
Related
Fixes #8246
Fixes #2915
Related to #3243 (stale PR with merge conflicts addressing similar ordering issue)
Prior Art
- LiteLLM offers the `modify_params=True` workaround, but many corporate proxy deployments cannot change this setting