Describe the bug
The api-key header is not being sent when using the AddOpenAIChatCompletion Kernel extensions. Instead, an Authorization: Bearer * header is sent, which does not work with Azure AI Foundry deployments - they require an api-key header.
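For illustration, with placeholder values, this is the difference between what Azure AI Foundry expects and what the connector currently sends:

Expected by Azure AI Foundry:
    api-key: <your-api-key>
Currently sent by the connector:
    Authorization: Bearer <your-api-key>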
To Reproduce
Steps to reproduce the behavior:
- Register an OpenAI chat client:
services.AddOpenAIChatCompletion(
    modelId: modelId,
    serviceId: modelId,
    openAIClient: new OpenAIClient(
        credential: new ApiKeyCredential(apiKey),
        options: new OpenAIClientOptions
        {
            Endpoint = new Uri(endpoint)
        }
    )
);
- Obtain it and send a request:
var chatModel = _semanticKernel.GetRequiredService<IChatCompletionService>(chatModelInfo.ModelId);
var response = await chatModel.GetChatMessageContentAsync(
    chatHistory: _chatHistory,
    kernel: _Kernel);
- The request will fail with a 401 Unauthorized.
Expected behavior
The api-key header should be sent.
Platform
- Language: C#
- Source: v. 1.3.6
- AI model: Phi-3-small-8k-instruct
Additional context
Obviously this can be worked around with a custom HTTP client that sends the correct header, but since this is a Microsoft-supported package, it should integrate seamlessly with Microsoft products.
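For reference, a minimal sketch of one such workaround, assuming a System.ClientModel pipeline policy added to the OpenAIClientOptions is acceptable. The ApiKeyHeaderPolicy class is hypothetical, and the endpoint/apiKey values are placeholders standing in for the ones used in the repro above:

using System;
using System.ClientModel.Primitives;
using System.Collections.Generic;
using System.Threading.Tasks;
using OpenAI;

var endpoint = "https://<resource>.services.ai.azure.com/models"; // placeholder
var apiKey = "<your-api-key>";                                     // placeholder

// Build the client options as in the repro, but add a per-call policy that sets the
// api-key header. Pass these options to new OpenAIClient(...) in the registration above.
var options = new OpenAIClientOptions { Endpoint = new Uri(endpoint) };
options.AddPolicy(new ApiKeyHeaderPolicy(apiKey), PipelinePosition.PerCall);

// Hypothetical policy that attaches the api-key header expected by Azure AI Foundry.
public sealed class ApiKeyHeaderPolicy : PipelinePolicy
{
    private readonly string _apiKey;

    public ApiKeyHeaderPolicy(string apiKey) => _apiKey = apiKey;

    public override void Process(PipelineMessage message, IReadOnlyList<PipelinePolicy> pipeline, int currentIndex)
    {
        message.Request.Headers.Set("api-key", _apiKey);
        ProcessNext(message, pipeline, currentIndex);
    }

    public override async ValueTask ProcessAsync(PipelineMessage message, IReadOnlyList<PipelinePolicy> pipeline, int currentIndex)
    {
        message.Request.Headers.Set("api-key", _apiKey);
        await ProcessNextAsync(message, pipeline, currentIndex).ConfigureAwait(false);
    }
}

This should not be necessary, though - the connector itself ought to send the header that Azure AI Foundry requires.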
The URL https://.services.ai.azure.com/models is passed to the endpoint parameter.
From the official docs it would seem that this is supported out of the box, but it is not: https://learn.microsoft.com/bg-bg/azure/ai-foundry/model-inference/concepts/endpoints?tabs=csharp#routing