
Conversation

@Sameerlite (Collaborator)

Title

Update new Anthropic features as reviewed

Relevant issues

Pre-Submission checklist

Please complete all items before asking a LiteLLM maintainer to review your PR

  • I have added testing in the tests/litellm/ directory (adding at least 1 test is a hard requirement; see details)
  • I have added a screenshot of my new test passing locally
  • My PR passes all unit tests on make test-unit
  • My PR's scope is as isolated as possible; it only solves 1 specific problem

Type

🆕 New Feature

Changes

  • Map output_config={"effort": ...} to OpenAI's reasoning_effort parameter (see the sketch after this list)
  • Support Anthropic models on Microsoft Foundry and add cost tracking for them (see the registration sketch after this list)
  • Automatically inject the relevant beta headers for tool_search and code_execution across the Anthropic API, Bedrock, and Vertex AI (see the override sketch in the review thread below)
  • Ignore input_examples in tool definitions when calling non-Anthropic models (see the sketch after this list)
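
A minimal sketch of the first and last transforms, assuming a standalone helper; the function name and parameter shapes here are illustrative, not LiteLLM's actual code:

```python
from typing import Any, Dict


def transform_params_for_openai(optional_params: Dict[str, Any]) -> Dict[str, Any]:
    """Illustrative transform for non-Anthropic (OpenAI-style) providers."""
    params = dict(optional_params)

    # Map Anthropic's output_config={"effort": ...} onto OpenAI's
    # reasoning_effort parameter.
    output_config = params.pop("output_config", None)
    if isinstance(output_config, dict) and "effort" in output_config:
        params["reasoning_effort"] = output_config["effort"]

    # Strip the Anthropic-only input_examples field from tool definitions
    # so other providers don't reject the request.
    if "tools" in params:
        params["tools"] = [
            {k: v for k, v in tool.items() if k != "input_examples"}
            for tool in params["tools"]
        ]
    return params
```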
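
And a sketch of the Foundry cost-tracking piece via litellm.register_model; the model name, provider key, and per-token prices are placeholders, not the values added in this PR:

```python
import litellm

# Placeholder entry; the PR adds the real prices to LiteLLM's cost map.
litellm.register_model({
    "foundry/claude-sonnet-example": {   # hypothetical model name
        "litellm_provider": "azure_ai",  # assumed provider key
        "mode": "chat",
        "input_cost_per_token": 3e-06,   # placeholder price
        "output_cost_per_token": 1.5e-05,  # placeholder price
    }
})
```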


Review thread on the beta-header injection logic:

```python
effort_used = self.is_effort_used(optional_params, model)

# Add beta headers based on provider
if custom_llm_provider in ["vertex_ai", "vertex_ai_beta"]:
```
Contributor

Can we not have this logic inside a function like this, but instead make it an override handled by the provider-specific implementations?

That would be cleaner and easier to maintain.
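
A rough sketch of the override pattern being suggested; the class names, the get_beta_headers hook, and the beta tag value are illustrative assumptions, not LiteLLM's actual API:

```python
from typing import Dict, List


class BaseLLMConfig:
    # Hypothetical base hook: by default a provider adds no beta headers.
    def get_beta_headers(self, effort_used: bool) -> Dict[str, str]:
        return {}


class VertexAIAnthropicConfig(BaseLLMConfig):
    # Vertex AI supplies its own headers, so the shared code no longer
    # needs an if/elif chain on custom_llm_provider.
    def get_beta_headers(self, effort_used: bool) -> Dict[str, str]:
        betas: List[str] = []
        if effort_used:
            betas.append("effort-beta-tag")  # placeholder; real tag per Anthropic docs
        return {"anthropic-beta": ",".join(betas)} if betas else {}


# The shared call site then stays provider-agnostic:
# headers.update(provider_config.get_beta_headers(effort_used))
```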

Collaborator (Author)

Fixed

@Sameerlite Sameerlite merged commit 6c326ce into main Nov 28, 2025
37 of 59 checks passed
