fix(anthropic): preserve server_tool_use and web_search_tool_result in multi-turn conversations #17746
Conversation
fix(anthropic): preserve server_tool_use and web_search_tool_result in multi-turn conversations

- Extract `web_search_tool_result` blocks in `extract_response_content()`
- Store `web_search_results` in `provider_specific_fields` for round-trip
- Detect the `srvtoolu_` prefix to reconstruct as `server_tool_use` (not `tool_use`)
- Add the corresponding `web_search_tool_result` after `server_tool_use` blocks

This ensures multi-turn conversations with Anthropic web search + custom tools work correctly, without Anthropic expecting a `tool_result` for server-side tool executions.
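To make the scenario concrete, here is a rough sketch of the multi-turn flow this fixes; the model name, the server-tool spec, and the custom `get_weather` tool are illustrative, and whether the Anthropic web-search tool dict is forwarded verbatim depends on your LiteLLM version:

```python
import litellm

# Anthropic's server-side web search tool (executed by Anthropic) plus a
# regular custom tool (executed by the caller). Spec shapes are illustrative.
tools = [
    {"type": "web_search_20250305", "name": "web_search", "max_uses": 3},
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the current weather for a city",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    },
]

messages = [{"role": "user", "content": "Find today's weather news for Madrid, then call get_weather."}]

# Turn 1: the assistant may return a server_tool_use (id starting with srvtoolu_)
# for web_search and a regular tool_use for get_weather.
resp = litellm.completion(model="anthropic/claude-sonnet-4-20250514", messages=messages, tools=tools)

# Turn 2: send back the assistant message plus a tool result for the *custom*
# tool only. Before this fix, the srvtoolu_ call was converted to a plain
# tool_use, and Anthropic rejected the request because it then expected a
# tool_result for the server-executed search as well.
messages.append(resp.choices[0].message)
messages.append({"role": "tool", "tool_call_id": "<get_weather call id>", "content": '{"temp_c": 21}'})
resp2 = litellm.completion(model="anthropic/claude-sonnet-4-20250514", messages=messages, tools=tools)
```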
@Chesars can you please ensure this covers streaming too?
@krrishdholakia @Chesars streaming is likely failing because of #17254. Here is an example:
I can confirm that this is the only issue. For example, postprocessing the streamed chunks works around it:

```python
import litellm

# chunks = the list of streamed chunks collected from the response

# Convert streamed chunks to a complete message
full_response = litellm.stream_chunk_builder(chunks)
content = full_response.choices[0].message.content
tool_calls = full_response.choices[0].message.tool_calls or []

# Workaround: strip the spurious "{}" prepended to streamed tool arguments
for tc in tool_calls:
    if tc.function.arguments:
        tc.function.arguments = tc.function.arguments.replace('{}', '')
```
…tions

Fixes BerriAI#18137

Similar to the fix for `web_search_tool_result` (BerriAI#17746, BerriAI#17798), this PR preserves `web_fetch_tool_result` blocks in multi-turn conversations.

Changes:
- Add handling for `web_fetch_tool_result` in `transformation.py` (non-streaming)
- Add capture of `web_fetch_tool_result` in `handler.py` (streaming)
- Fix streaming tool arguments bug where an empty input `{}` was prepended to the actual arguments, by using an empty string instead of `str({})`
- Add unit tests for `web_fetch_tool_result` handling
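For reference, the streaming-arguments bug mentioned above can be pictured roughly like this (the function name is illustrative, not the actual handler code): seeding the accumulated arguments with `str({})` leaves a literal `"{}"` that the later `input_json_delta` chunks get appended to.

```python
import json

def initial_arguments(content_block: dict) -> str:
    """Seed the accumulated tool arguments for a streamed tool_use block."""
    tool_input = content_block.get("input") or {}
    # Buggy behaviour: str({}) yields "{}", so the final arguments string
    # becomes '{}{"query": "..."}' once the JSON deltas are appended.
    #   return str(tool_input)
    # Fixed behaviour: start from an empty string when there is no input yet.
    return json.dumps(tool_input) if tool_input else ""
```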
Relevant issues
Fixes #17737
Pre-Submission checklist
- I have added testing in the `tests/litellm/` directory
- My PR passes all unit tests on `make test-unit`

Type
🐛 Bug Fix
Changes
Bug 1: `web_search_tool_result` is dropped

When Anthropic returns web search results, LiteLLM was ignoring that field.
Example request
Anthropic returns:
{"type": "web_search_tool_result", "tool_use_id": "srvtoolu_01ABC", "content": [...]}LiteLLM returned to user: Nothing. search results were lost.
Fix: Extract `web_search_tool_result` and include it in `provider_specific_fields.web_search_results`.
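A sketch of what the round-trip looks like after the fix; the model name and tool spec are illustrative, and the exact response shape may differ by LiteLLM version:

```python
import litellm

resp = litellm.completion(
    model="anthropic/claude-sonnet-4-20250514",
    messages=[{"role": "user", "content": "What happened in AI news today?"}],
    tools=[{"type": "web_search_20250305", "name": "web_search"}],
)

msg = resp.choices[0].message
# After the fix, the raw web_search_tool_result blocks ride along in
# provider_specific_fields so they can be echoed back on the next turn.
web_results = (msg.provider_specific_fields or {}).get("web_search_results")
print(web_results)
```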
Bug 2: `server_tool_use` reconstructed as `tool_use`

When the user sends messages back for a multi-turn conversation, LiteLLM was converting server-side tool calls into regular tool calls.
User sends to LiteLLM:
{"tool_calls": [{"id": "srvtoolu_01ABC", "function": {"name": "web_search", ...}}]}LiteLLM sent to Anthropic (before fix):
{"type": "tool_use", "id": "srvtoolu_01ABC", "name": "web_search", ...}❌ Anthropic requires
tool_resultfor everytool_use, but the user can't provide one for server-executed tools.LiteLLM sends to Anthropic (after fix):
{"type": "server_tool_use", "id": "srvtoolu_01ABC", "name": "web_search", ...} {"type": "web_search_tool_result", "tool_use_id": "srvtoolu_01ABC", "content": [...]}✅ Block types.
Fix: Detect the `srvtoolu_` prefix and reconstruct the call as `server_tool_use` + `web_search_tool_result`.
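The reconstruction in `factory.py` can be pictured roughly like this (a simplified sketch, not the actual implementation):

```python
import json

def to_anthropic_tool_blocks(tool_call: dict, web_search_results=None) -> list[dict]:
    """Rebuild Anthropic content blocks from an OpenAI-style assistant tool call."""
    block = {
        "id": tool_call["id"],
        "name": tool_call["function"]["name"],
        "input": json.loads(tool_call["function"]["arguments"] or "{}"),
    }
    if tool_call["id"].startswith("srvtoolu_"):
        # Server-executed tool: emit server_tool_use plus the stored search
        # results so Anthropic does not expect a client-side tool_result.
        blocks = [{"type": "server_tool_use", **block}]
        if web_search_results:
            blocks.append({
                "type": "web_search_tool_result",
                "tool_use_id": tool_call["id"],
                "content": web_search_results,
            })
        return blocks
    # Regular client tool call keeps the normal tool_use block.
    return [{"type": "tool_use", **block}]
```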
Files changed

- `litellm/llms/anthropic/chat/transformation.py` - Extract `web_search_tool_result`
- `litellm/litellm_core_utils/prompt_templates/factory.py` - Reconstruct `server_tool_use`
- `tests/test_litellm/llms/anthropic/chat/test_anthropic_chat_transformation.py` - 3 new tests
- `tests/llm_translation/test_prompt_factory.py` - 5 new tests