fix: stop leaking x-litellm-api-key to Anthropic + support OAuth tokens in passthrough #20432
klaudworks wants to merge 1 commit into BerriAI:main
Conversation
Greptile Summary

This PR addresses security and functionality issues in the Anthropic passthrough endpoint:

- Security fix: x-litellm-api-key is no longer forwarded to upstream providers.
- Credential handling improvements: client-provided credentials now take precedence over server credentials.

Confidence Score: 4/5
| Filename | Overview |
|---|---|
| litellm/passthrough/utils.py | Added x-litellm-api-key to the list of headers that should not be forwarded to upstream providers, fixing the security leak |
| litellm/proxy/pass_through_endpoints/llm_passthrough_endpoints.py | Implemented credential priority logic for Anthropic passthrough: client headers take precedence over server credentials, avoiding literal "None" string |
| tests/test_litellm/proxy/pass_through_endpoints/test_passthrough_endpoints_common_utils.py | Added unit tests verifying x-litellm-api-key is correctly stripped from forwarded headers |
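The header-stripping change described in the table above can be sketched as follows. The function signature and merge order here are assumptions inferred from the file overview, not a copy of litellm's actual code:

```python
def forward_headers_from_request(
    request_headers: dict,
    headers: dict,
    forward_headers: bool = False,
) -> dict:
    """Merge forwarded client headers into outbound headers, dropping
    headers that must never reach the upstream provider. Illustrative
    sketch only; header keys are assumed to be lowercased."""
    if forward_headers:
        request_headers = dict(request_headers)  # avoid mutating caller's dict
        # Hop-by-hop headers that would be wrong for the upstream request.
        request_headers.pop("content-length", None)
        request_headers.pop("host", None)
        # The fix: never leak LiteLLM's virtual key to the provider.
        request_headers.pop("x-litellm-api-key", None)
        # Explicit custom headers win over forwarded client headers.
        headers = {**request_headers, **headers}
    return headers
```

With this shape, the unit tests in the table reduce to asserting which keys survive the merge.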
Sequence Diagram
sequenceDiagram
participant Client
participant Proxy as LiteLLM Proxy
participant PassThrough as anthropic_proxy_route
participant Utils as forward_headers_from_request
participant Anthropic as Anthropic API
Note over Client,Anthropic: Scenario 1: Client provides x-api-key
Client->>Proxy: POST /anthropic/v1/messages<br/>Headers: x-litellm-api-key, x-api-key
Proxy->>PassThrough: Forward request
PassThrough->>PassThrough: Check headers<br/>x_api_key_header exists
PassThrough->>PassThrough: custom_headers = {}
PassThrough->>Utils: forward_headers_from_request()<br/>strips x-litellm-api-key
Utils-->>PassThrough: Headers without x-litellm-api-key
PassThrough->>Anthropic: Request with client x-api-key<br/>(no server credentials added)
Anthropic-->>Client: Response
Note over Client,Anthropic: Scenario 2: Client provides Authorization (OAuth)
Client->>Proxy: POST /anthropic/v1/messages<br/>Headers: x-litellm-api-key, Authorization
Proxy->>PassThrough: Forward request
PassThrough->>PassThrough: Check headers<br/>auth_header exists
PassThrough->>PassThrough: custom_headers = {}
PassThrough->>Utils: forward_headers_from_request()<br/>strips x-litellm-api-key
Utils-->>PassThrough: Headers with Authorization
PassThrough->>Anthropic: Request with OAuth token<br/>(no server credentials added)
Anthropic-->>Client: Response
Note over Client,Anthropic: Scenario 3: No client credentials
Client->>Proxy: POST /anthropic/v1/messages<br/>Headers: x-litellm-api-key only
Proxy->>PassThrough: Forward request
PassThrough->>PassThrough: Check headers<br/>no x-api-key or Authorization
PassThrough->>PassThrough: Get server credentials
PassThrough->>PassThrough: custom_headers = server credentials
PassThrough->>Utils: forward_headers_from_request()<br/>strips x-litellm-api-key
Utils-->>PassThrough: Headers without x-litellm-api-key
PassThrough->>Anthropic: Request with server credentials
Anthropic-->>Client: Response
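The three scenarios above reduce to a small priority check. The sketch below is illustrative: the helper name resolve_anthropic_credentials and the environment-variable lookup are assumptions, not litellm's actual implementation, and header keys are assumed to be lowercased:

```python
import os


def resolve_anthropic_credentials(request_headers: dict) -> dict:
    """Return extra headers to attach before forwarding to Anthropic."""
    # Scenarios 1 and 2: the client supplied its own credentials
    # (x-api-key or an OAuth Authorization header), so add nothing
    # and let the client's headers pass through untouched.
    if "x-api-key" in request_headers or "authorization" in request_headers:
        return {}
    # Scenario 3: fall back to server credentials, but only if one is
    # actually configured, so the literal string "None" is never sent.
    server_key = os.environ.get("ANTHROPIC_API_KEY")
    if server_key is not None:
        return {"x-api-key": server_key}
    return {}
```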
The linting failure is in a file that I did not touch:

@ishaan-jaff I'm seeing a lot of open PRs without any comments. Do you actually process external PRs? Otherwise I'll release my fork.
@klaudworks I just merged #21039, which deals with the OAuth issue. Can you refactor this to just handle the x-litellm-api-key scenario?
fix: stop leaking x-litellm-api-key to upstream providers

Prevent x-litellm-api-key (LiteLLM's virtual key) from being leaked to upstream providers when _forward_headers=True is used in passthrough endpoints.

Force-pushed 4a8170d to 27863e3
@krrishdholakia I adapted the PR.
@greptile please review this
Greptile Summary

This PR adds a security fix to strip x-litellm-api-key from forwarded headers before requests reach upstream providers.

Confidence Score: 4/5
| Filename | Overview |
|---|---|
| litellm/passthrough/utils.py | Single-line security fix: strips x-litellm-api-key from forwarded headers to prevent leaking LiteLLM proxy authentication credentials to upstream providers. The change is correct and placed in the right location alongside existing header stripping for content-length and host. |
| tests/test_litellm/proxy/pass_through_endpoints/test_passthrough_endpoints_common_utils.py | Adds two unit tests for header stripping in BasePassthroughUtils.forward_headers_from_request: one verifying x-litellm-api-key removal, another verifying host and content-length removal. Tests are mock-only (no network calls), which is correct per repository rules. Missing newline at end of file. |
Sequence Diagram
sequenceDiagram
participant Client
participant Proxy as LiteLLM Proxy
participant FwdHeaders as forward_headers_from_request
participant Anthropic as Anthropic API
Client->>Proxy: Request with headers including x-litellm-api-key
Proxy->>Proxy: Build custom_headers with server credentials
Proxy->>FwdHeaders: request_headers, custom_headers, forward_headers=True
Note over FwdHeaders: Strip content-length
Note over FwdHeaders: Strip host
Note over FwdHeaders: Strip x-litellm-api-key (NEW)
Note over FwdHeaders: Merge request + custom headers
FwdHeaders-->>Proxy: Merged headers without x-litellm-api-key
Proxy->>Anthropic: Forward request with safe headers
Last reviewed commit: 27863e3
assert "host" not in result
assert "content-length" not in result
assert result.get("content-type") == "application/json"
(no newline at end of file)
Missing newline at end of file
The file is missing a trailing newline, which can cause issues with some tools and produces a noisy diff marker. Add a newline at the end.
@krrishdholakia do you still plan to merge this?
@krrishdholakia you asked me to modify this after separately merging a subset of this PR in a PR that was created AFTER this one. I pulled out the OAuth changes, as you explicitly asked, so please review this or close the PR.

@krrishdholakia ping. I'd be glad if I could use an upstream litellm image without leaking my litellm API keys to Anthropic.
Summary

This PR fixes credential handling in the /anthropic/{endpoint:path} passthrough endpoint. There are two issues here: one is leaking LiteLLM API keys to Anthropic, the other is headers being improperly overwritten.

Security Fix

- x-litellm-api-key was being forwarded to Anthropic. Now we strip it in forward_headers_from_request.

Bug Fixes

- x-api-key was always overwritten: server credentials overwrote client-provided API keys.
- Authorization header was overwritten: OAuth tokens (used by Claude Code Max subscriptions) were ignored, so this didn't work: https://docs.litellm.ai/docs/tutorials/claude_code_max_subscription
- "None" sent as API key: if the server had no ANTHROPIC_API_KEY configured, it sent the literal string "None" to Anthropic.

Credential Priority (new behavior)

- Client x-api-key → forward as-is
- Client Authorization → forward as-is
- Neither present → fall back to server credentials, only if actually configured

Verification

- x-litellm-api-key is no longer forwarded to upstream providers
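The verified behavior can be checked with a self-contained sketch. Here build_outbound_headers is an illustrative stand-in combining both fixes (virtual-key stripping plus credential priority); it is not litellm's actual code, and header keys are lowercased for simplicity:

```python
def build_outbound_headers(request_headers: dict, server_api_key=None) -> dict:
    """Illustrative stand-in: compute headers to send to Anthropic."""
    headers = {k.lower(): v for k, v in request_headers.items()}
    # Never forward LiteLLM's own virtual key upstream.
    headers.pop("x-litellm-api-key", None)
    # Inject server credentials only when the client sent none,
    # and only if a server key is actually configured (never "None").
    if "x-api-key" not in headers and "authorization" not in headers:
        if server_api_key is not None:
            headers["x-api-key"] = server_api_key
    return headers


# Scenario 2: client OAuth token wins, virtual key is stripped,
# and no server key is layered on top.
out = build_outbound_headers(
    {"x-litellm-api-key": "sk-virtual", "Authorization": "Bearer oauth-token"},
    server_api_key="sk-ant-server",
)
assert "x-litellm-api-key" not in out
assert out["authorization"] == "Bearer oauth-token"
assert "x-api-key" not in out
```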