### What happened?

Direct call to the Azure AI Foundry endpoint:
```bash
curl --request POST \
  --url https://my-model.eastus2.models.ai.azure.com/chat/completions \
  --header 'Authorization: Bearer xxx' \
  --header 'Content-Type: application/json' \
  --data '{
    "model": "phi-4",
    "messages": [
      {
        "role": "user",
        "content": "ping"
      }
    ],
    "max_tokens": 10
  }'
```
Successful response:

```json
{
  "id": "chatcmpl-691aa22b-766b-4431-82e9-b3acccb249f8",
  "object": "chat.completion",
  "created": 1748498263,
  "model": "phi4",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": "It looks like you're referencing the \"ping\"",
        "tool_calls": [],
        "reasoning_content": null
      },
      "finish_reason": "length"
    }
  ],
  "usage": {
    "prompt_tokens": 8,
    "total_tokens": 18,
    "completion_tokens": 10,
    "prompt_tokens_details": null
  }
}
```
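
For reference, the same direct call also succeeds from Python (a minimal sketch using `requests`; the endpoint URL and bearer token are the placeholders from above):

```python
# Sketch: reproduce the direct Foundry call in Python. Endpoint and key are
# the placeholder values used in the curl example above.
import requests

resp = requests.post(
    "https://my-model.eastus2.models.ai.azure.com/chat/completions",
    headers={
        "Authorization": "Bearer xxx",
        "Content-Type": "application/json",
    },
    # requests serializes this dict to JSON exactly once, which is what the
    # endpoint expects (a JSON object, not a JSON-encoded string).
    json={
        "model": "phi-4",
        "messages": [{"role": "user", "content": "ping"}],
        "max_tokens": 10,
    },
    timeout=30,
)
print(resp.status_code, resp.json())
```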
`proxy-config.yaml` entry for the model:

```yaml
model_list:
  - model_name: phi-4
    litellm_params:
      model: azure_ai/my-model
      api_base: https://my-model.eastus2.models.ai.azure.com
      api_key: xxx
    model_info:
      mode: chat
      region: "East US 2"
      base_model: azure_ai/Phi-4
      access_groups: ["default-models"]
      health_check_timeout: 1
```
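
To help isolate whether the problem is in the proxy layer or in the `azure_ai` handler itself, the same deployment can be called through the LiteLLM SDK directly, bypassing the proxy (a sketch using the config values above; `api_base` and `api_key` are placeholders):

```python
# Sketch: call the Foundry deployment through the litellm SDK directly,
# bypassing the proxy, to see whether the azure_ai handler alone
# reproduces the upstream 422. api_base/api_key are placeholders.
import litellm

response = litellm.completion(
    model="azure_ai/my-model",
    api_base="https://my-model.eastus2.models.ai.azure.com",
    api_key="xxx",
    messages=[{"role": "user", "content": "ping"}],
    max_tokens=10,
)
print(response)
```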
Call via the LiteLLM proxy:

```bash
curl --request POST \
  --url https://xxx.com/chat/completions \
  --header 'api-key: xxx' \
  --header 'Content-Type: application/json' \
  --data '{
    "model": "phi-4",
    "messages": [
      {
        "role": "user",
        "content": "ping"
      }
    ],
    "max_tokens": 10
  }'
```
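
The equivalent proxy request can also be issued through the OpenAI SDK pointed at the proxy (a sketch; `base_url` and `api_key` are placeholders, and the SDK sends the key as an `Authorization: Bearer` header, which the proxy also accepts):

```python
# Sketch: the same proxy request via the OpenAI SDK. base_url and api_key
# are placeholders; the SDK posts to {base_url}/chat/completions.
from openai import OpenAI

client = OpenAI(base_url="https://xxx.com", api_key="xxx")
completion = client.chat.completions.create(
    model="phi-4",
    messages=[{"role": "user", "content": "ping"}],
    max_tokens=10,
)
print(completion.choices[0].message.content)
```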
Error response (400 Bad Request):

```json
{
  "error": {
    "message": "litellm.BadRequestError: Azure_aiException - {\"error\":{\"code\":\"Invalid input\",\"status\":422,\"message\":\"invalid input error\",\"details\":[{\"type\":\"model_attributes_type\",\"loc\":[\"body\"],\"msg\":\"Input should be a valid dictionary or object to extract fields from\",\"input\":\"{\\\"model\\\": \\\"phi-4-dcai\\\", \\\"messages\\\": [{\\\"role\\\": \\\"user\\\", \\\"content\\\": \\\"ping\\\"}], \\\"stream\\\": false, \\\"max_tokens\\\": 10}\"}]}}. Received Model Group=phi-4\nAvailable Model Group Fallbacks=None",
    "type": null,
    "param": null,
    "code": "400"
  }
}
```
Note the nested `input` field in the upstream 422: the Foundry endpoint received the request body as a JSON-encoded string rather than a JSON object, which suggests the body is being serialized twice somewhere in the `azure_ai` handler.

Expected: a successful response through the LiteLLM proxy, identical to the one returned when calling the Azure AI Foundry endpoint directly.
### Relevant log output
### Are you a ML Ops Team?
Yes
### What LiteLLM version are you on?
1.71.2.dev1
### Twitter / LinkedIn details
No response