feat(databricks): add User-Agent partner attribution support #15392
Conversation
@jatorre is attempting to deploy a commit to the CLERKIEAI Team on Vercel. A member of the Team first needs to authorize it.
```python
# Priority 3: Check litellm_settings (global proxy config)
if not partner and litellm_params:
    settings = litellm_params.get("litellm_settings", {})
```
global settings are at the module level - litellm.<attribute>
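For context, a minimal sketch of the module-level pattern the reviewer is pointing at; `databricks_partner` as a `litellm` module attribute is this PR's proposed setting, not an existing one:

```python
import litellm

# Per the reviewer, proxy-level litellm_settings are exposed as attributes on the
# litellm module, so a global default would be read like this rather than from
# litellm_params["litellm_settings"].
partner = getattr(litellm, "databricks_partner", None)
```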
```python
version = metadata.get("databricks_product_version")

# Priority 2: Check optional_params (model-level config)
if not partner and optional_params:
```
optional params will handle both per request and model level values
```python
version = None

# Priority 1: Check metadata parameter (per-request)
if litellm_params:
```
this should not be needed. optional params handles both
did you manually qa this? it looks wrong
Adds support for ISV partner attribution via Databricks SDK User-Agent headers, enabling proper identification and tracking of requests from partner integrations.

Implementation follows existing LiteLLM patterns (e.g., AWS Bedrock) with a 2-priority system:
1. optional_params (per-request and model-level config)
2. Environment variables (preserved if pre-set)

Configuration examples:

Per-request:

```python
litellm.completion(
    model="databricks/...",
    databricks_partner="carto",
    databricks_product="agentic-gis",
    databricks_product_version="1.0.0"
)
```

Model-level (proxy config):

```yaml
model_list:
  - model_name: my-model
    litellm_params:
      model: databricks/...
      databricks_partner: carto
      databricks_product: agentic-gis
      databricks_product_version: 1.0.0
```

Changes:
- 20 lines implementation code
- 28 lines test code
- 2 essential tests (happy path + env vars preserved)
- Follows Databricks SDK conventions

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
Force-pushed from d5b798e to 9aa811f.
Sorry for the initial messy PR! I've now studied how other providers (AWS Bedrock, Azure, etc.) handle similar configurations and simplified significantly.

What Changed

Removed unnecessary complexity:
Current implementation:
Usage:

```python
# Per-request or model-level (via optional_params)
litellm.completion(
    model="databricks/databricks-meta-llama-3-1-70b-instruct",
    messages=[{"role": "user", "content": "Hello"}],
    databricks_partner="carto",
    databricks_product="agentic-gis",
    databricks_product_version="1.0.0"
)
```
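For illustration, a minimal sketch of the 2-priority resolution described in the commit message above (optional_params first, then environment variables preserved if pre-set); the helper name, the environment variable names, and the User-Agent segment format are assumptions for the sketch, not the PR's actual code:

```python
import os
from typing import Optional


def _databricks_user_agent_suffix(optional_params: dict) -> Optional[str]:
    """Build a partner attribution suffix for the User-Agent header (sketch)."""
    # Priority 1: per-request kwargs and model-level litellm_params both arrive
    # here via optional_params.
    partner = optional_params.get("databricks_partner")
    product = optional_params.get("databricks_product")
    version = optional_params.get("databricks_product_version")

    # Priority 2: environment variables; values that are already set are used
    # as-is and never overwritten. The variable names follow Databricks SDK
    # conventions and are an assumption here.
    partner = partner or os.environ.get("DATABRICKS_SDK_UPSTREAM")
    version = version or os.environ.get("DATABRICKS_SDK_UPSTREAM_VERSION")

    if not partner:
        return None
    return " ".join(p for p in (partner, product, version) if p)
```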
Summary
Adds support for ISV partner attribution via Databricks SDK User-Agent headers, enabling proper identification and tracking of requests from partner integrations.
Problem
Databricks requires ISV partners to identify their requests via User-Agent for proper attribution and analytics, as documented in the Databricks SDK.
This is currently not configurable in LiteLLM, preventing ISV partners from meeting Databricks compliance requirements.
Solution
Added multi-tier configuration system with clear priority hierarchy:
1. litellm_params
2. litellm_settings

- Follows existing Vertex AI labels pattern for consistency
- Backwards compatible (all new parameters are optional)
- Works seamlessly with proxy and Responses API
Configuration Examples
Global proxy config (recommended for proxy deployments):
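A sketch of the global proxy config under the originally proposed litellm_settings keys (the review discussion above questions this tier, so treat it as the description's original proposal rather than final behavior):

```yaml
litellm_settings:
  databricks_partner: carto
  databricks_product: agentic-gis
  databricks_product_version: "1.0.0"
```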
Per-request metadata (for dynamic scenarios):
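A sketch of the per-request metadata form as originally proposed (the review above notes this tier was later dropped in favor of optional_params); parameter names are taken from this PR:

```python
import litellm

response = litellm.completion(
    model="databricks/databricks-meta-llama-3-1-70b-instruct",
    messages=[{"role": "user", "content": "Hello"}],
    metadata={
        "databricks_partner": "carto",
        "databricks_product": "agentic-gis",
        "databricks_product_version": "1.0.0",
    },
)
```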
Model-level configuration:
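Model-level configuration in the proxy's model_list, matching the example in the commit message above:

```yaml
model_list:
  - model_name: my-model
    litellm_params:
      model: databricks/databricks-meta-llama-3-1-70b-instruct
      databricks_partner: carto
      databricks_product: agentic-gis
      databricks_product_version: "1.0.0"
```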
Environment variables (fallback):
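The environment-variable fallback isn't shown in this PR excerpt; the variable names below (DATABRICKS_SDK_UPSTREAM / DATABRICKS_SDK_UPSTREAM_VERSION) are an assumption based on Databricks SDK conventions:

```python
import os

# Pre-set values are preserved by the integration (see the commit message above),
# so setting them here acts as a process-wide fallback. Variable names are assumed.
os.environ.setdefault("DATABRICKS_SDK_UPSTREAM", "carto")
os.environ.setdefault("DATABRICKS_SDK_UPSTREAM_VERSION", "1.0.0")
```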
Benefits
Testing
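The commit message mentions two essential tests (happy path + env vars preserved); below is a hedged sketch of the env-var test. `get_databricks_user_agent_extra` is a hypothetical name standing in for whatever helper the PR's common_utils actually exposes, and the env var names are assumed as above.

```python
import os


def test_preexisting_env_vars_are_preserved(monkeypatch):
    # Pre-set partner attribution env vars before invoking the helper.
    monkeypatch.setenv("DATABRICKS_SDK_UPSTREAM", "preset-partner")
    monkeypatch.setenv("DATABRICKS_SDK_UPSTREAM_VERSION", "9.9.9")

    # Hypothetical helper name; see the note above.
    from litellm.llms.databricks.common_utils import get_databricks_user_agent_extra

    get_databricks_user_agent_extra(optional_params={})

    # Values that were already set must not be overwritten.
    assert os.environ["DATABRICKS_SDK_UPSTREAM"] == "preset-partner"
    assert os.environ["DATABRICKS_SDK_UPSTREAM_VERSION"] == "9.9.9"
```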
Files Changed
- litellm/llms/databricks/common_utils.py - Core implementation
- litellm/llms/databricks/chat/transformation.py - Integration point
- tests/test_litellm/llms/databricks/test_databricks_common_utils.py - Test coverage

Related
This addresses the ISV integration requirements documented in the Databricks SDK and enables proper partner attribution for any organization using LiteLLM as a proxy to Databricks Foundation Model APIs.