Add Azure OpenAI support to simulator model config #346


Merged · 4 commits · Jul 24, 2025
42 changes: 29 additions & 13 deletions src/platform/endpoint/test/node/openaiCompatibleEndpoint.ts
@@ -24,6 +24,8 @@ export type IModelConfig = {
 	id: string;
 	name: string;
 	version: string;
+	type: 'openai' | 'azureOpenai';
+	useDeveloperRole: boolean;
 	capabilities: {
 		supports: {
 			parallel_tool_calls: boolean;
@@ -64,7 +66,7 @@ export class OpenAICompatibleTestEndpoint extends ChatEndpoint {
 			is_chat_fallback: false,
 			capabilities: {
 				type: 'chat',
-				family: 'openai',
+				family: modelConfig.type === 'azureOpenai' ? 'azure' : 'openai',
 				tokenizer: TokenizerType.O200K,
 				supports: {
 					parallel_tool_calls: modelConfig.capabilities.supports.parallel_tool_calls,
@@ -105,26 +107,40 @@ export class OpenAICompatibleTestEndpoint extends ChatEndpoint {
 			throw new Error(`API key environment variable ${this.modelConfig.apiKeyEnvName} is not set`);
 		}
 
+		if (this.modelConfig.type === 'azureOpenai') {
Review comment (Member): Should this be in a separate Azure endpoint, just in case they continue to diverge? Maybe via an override. You can see BYOK does this.

Reply (Member, Author): What do you mean by an override? I was on the side of keeping it all in one, and creating a separate Azure endpoint only if it diverges a lot. I'm still unsure how much more if-elsing will be needed before we know the right separations (we will probably have more knowledge as we continue to try more types of models).

+			return {
+				"api-key": apiKey,
+				"Content-Type": "application/json",
+			};
+		}
 
 		return {
 			"Authorization": `Bearer ${apiKey}`,
 			"Content-Type": "application/json",
 		};
 	}

 	override interceptBody(body: IEndpointBody | undefined): void {
-		if (!body) {
-			return;
-		}
-		const newMessages = body.messages!.map((message: CAPIChatMessage) => {
-			if (message.role === OpenAI.ChatRole.System) {
-				return { role: 'developer' as OpenAI.ChatRole.System, content: message.content };
-			}
-			return message;
-		});
-		Object.keys(body).forEach(key => delete (body as any)[key]);
-		body.model = this.modelConfig.id; //TODO: is id the right field?
-		body.messages = newMessages;
-		body.stream = false;
+		super.interceptBody(body);
+		if (this.modelConfig.type === 'azureOpenai') {
+			if (body) {
+				delete body.snippy;
+				delete body.intent;
+			}
+		}
+
+		if (this.modelConfig.useDeveloperRole && body) {
+			const newMessages = body.messages!.map((message: CAPIChatMessage) => {
+				if (message.role === OpenAI.ChatRole.System) {
+					return { role: 'developer' as OpenAI.ChatRole.System, content: message.content };
+				}
+				return message;
+			});
+			Object.keys(body).forEach(key => delete (body as any)[key]);
+			body.model = this.modelConfig.id; //TODO: is id the right field?
+			body.messages = newMessages;
+			body.stream = false;
+		}
 	}

override async acceptChatPolicy(): Promise<boolean> {
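The two behaviors this file adds can be seen in isolation in the following standalone sketch. It is illustrative only: `ModelConfig` and `ChatMessage` are simplified stand-ins for the real `IModelConfig` and `CAPIChatMessage` types, and `headersFor`/`rewriteMessages` are hypothetical helper names, not functions from the codebase. The underlying facts are from the diff: Azure OpenAI authenticates with an `api-key` header while the OpenAI API uses a Bearer token, and when `useDeveloperRole` is set, system messages are rewritten to the `developer` role.

```typescript
// Simplified stand-ins for IModelConfig and CAPIChatMessage (illustrative only).
type ChatMessage = { role: 'system' | 'user' | 'developer'; content: string };

interface ModelConfig {
	id: string;
	type: 'openai' | 'azureOpenai';
	useDeveloperRole: boolean;
}

// Azure OpenAI expects the key in an `api-key` header; the OpenAI API
// expects `Authorization: Bearer <key>`. The endpoint picks one by `type`.
function headersFor(config: ModelConfig, apiKey: string): Record<string, string> {
	if (config.type === 'azureOpenai') {
		return { 'api-key': apiKey, 'Content-Type': 'application/json' };
	}
	return { 'Authorization': `Bearer ${apiKey}`, 'Content-Type': 'application/json' };
}

// With useDeveloperRole set, system messages become developer messages;
// all other messages pass through unchanged.
function rewriteMessages(config: ModelConfig, messages: ChatMessage[]): ChatMessage[] {
	if (!config.useDeveloperRole) {
		return messages;
	}
	return messages.map(m =>
		m.role === 'system' ? { role: 'developer', content: m.content } : m
	);
}

const config: ModelConfig = { id: 'my-model', type: 'azureOpenai', useDeveloperRole: true };
console.log(headersFor(config, 'secret')['api-key']); // 'secret'
console.log(rewriteMessages(config, [{ role: 'system', content: 'hi' }])[0].role); // 'developer'
```

Note that the real `interceptBody` also strips the CAPI-specific `snippy` and `intent` fields for Azure endpoints, which this sketch omits.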
9 changes: 9 additions & 0 deletions test/simulationMain.ts
@@ -807,6 +807,8 @@ function parseModelConfigFile(modelConfigFilePath: string): IModelConfig[] {
 		"<model id>": {
 			"name": "<model name>",
 			"version": "<model version>",
+			"type": "<model type>", // 'openai' or 'azureOpenai'
+			"useDeveloperRole": <boolean>, // optional, defaults to false
 			"capabilities": {
 				"supports"?: {
 					"parallel_tool_calls"?: <boolean>,
@@ -847,6 +849,11 @@ function parseModelConfigFile(modelConfigFilePath: string): IModelConfig[] {
 		}
 		checkProperty(model, 'name', 'string');
 		checkProperty(model, 'version', 'string');
+		checkProperty(model, 'type', 'string');
+		if (model.type !== 'openai' && model.type !== 'azureOpenai') {
+			throw new Error(`Model type '${model.type}' is not supported. Only 'openai' and 'azureOpenai' are allowed.`);
+		}
+		checkProperty(model, 'useDeveloperRole', 'boolean', true);
 		checkProperty(model, 'capabilities', 'object');
 		checkProperty(model.capabilities, 'supports', 'object', true);
 		if (model.capabilities.supports) {
@@ -867,6 +874,8 @@ function parseModelConfigFile(modelConfigFilePath: string): IModelConfig[] {
 			id: modelId,
 			name: model.name,
 			version: model.version,
+			type: model.type,
+			useDeveloperRole: model.useDeveloperRole ?? false,
 			capabilities: {
 				supports: {
 					parallel_tool_calls: model.capabilities.supports.parallel_tool_calls ?? false,
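Under the validation rules added above, a minimal model config entry might look like the following. This is a sketch, not a complete config: the model id, name, and version are made up, and the template in the diff is truncated, so required fields beyond those shown (for example, whatever carries `apiKeyEnvName`, which the endpoint code reads) are omitted here.

```json
{
	"my-azure-deployment": {
		"name": "example Azure OpenAI deployment",
		"version": "2025-01-01",
		"type": "azureOpenai",
		"useDeveloperRole": true,
		"capabilities": {
			"supports": {
				"parallel_tool_calls": true
			}
		}
	}
}
```

A `"type"` of anything other than `"openai"` or `"azureOpenai"` fails parsing with the error added in this PR, and omitting `"useDeveloperRole"` defaults it to `false`.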