
Python: Bug: AzureResponsesAgent with reasoning model doesn't work #12843

@angangwa

Description


Describe the bug
When using a reasoning model such as o4-mini with AzureResponsesAgent and tool calls, the agent errors out with store_enabled set to either True or False. This is likely because the previous response id is not being passed, or the encrypted reasoning content is not sent, respectively.
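For background: with store=True the Responses API keeps conversation state server-side, and each follow-up call only needs to reference the prior turn via previous_response_id. A minimal sketch of that chaining, where the helper name is hypothetical but "previous_response_id" and "store" are real Responses API parameters:

```python
# Illustrative only: how Responses API turns are chained when the server
# stores state (store=True). The helper name is hypothetical.
def build_followup_request(model: str, previous_response_id: str, user_text: str) -> dict:
    """Assemble kwargs for a follow-up client.responses.create(...) call."""
    return {
        "model": model,
        "store": True,
        "previous_response_id": previous_response_id,  # links this turn to the previous one
        "input": [{"role": "user", "content": user_text}],
    }

kwargs = build_followup_request("o4-mini", "resp_123", "How much is that?")
```

If the agent drops previous_response_id between turns, the server cannot locate the earlier reasoning and tool-call items, which matches the symptom above.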

To Reproduce
Here is the Python code to reproduce the error:

import os

from openai import AsyncOpenAI

client = AsyncOpenAI(
    api_key=os.getenv("AZURE_REASONING_API_KEY"),
    base_url="https://<TODO>.cognitiveservices.azure.com/openai/v1/",  # replace <TODO> with your Azure resource name
    default_query={"api-version": "preview"},
)
from typing import Annotated

from semantic_kernel.agents import AzureResponsesAgent
from semantic_kernel.contents import AuthorRole, FunctionCallContent, FunctionResultContent
from semantic_kernel.contents.chat_message_content import ChatMessageContent
from semantic_kernel.functions import kernel_function


# Define a sample plugin for the sample
class MenuPlugin:
    """A sample Menu Plugin used for the concept sample."""

    @kernel_function(description="Provides a list of specials from the menu.")
    def get_specials(self) -> Annotated[str, "Returns the specials from the menu."]:
        return """
        Special Soup: Clam Chowder
        Special Salad: Cobb Salad
        Special Drink: Chai Tea
        """

    @kernel_function(description="Provides the price of the requested menu item.")
    def get_item_price(
        self, menu_item: Annotated[str, "The name of the menu item."]
    ) -> Annotated[str, "Returns the price of the menu item."]:
        return "$9.99"


MESSAGES = []


async def handle_intermediate_steps(message: ChatMessageContent) -> None:
    MESSAGES.append(message)
    for item in message.items or []:
        if isinstance(item, FunctionResultContent):
            print(f"Function Result:> {item.result} for function: {item.name}")
        elif isinstance(item, FunctionCallContent):
            print(f"Function Call:> {item.name} with arguments: {item.arguments}")
        else:
            print(f"{item}")


async def main():
    # 1. Create the client using Azure OpenAI resources and configuration
    # client = AzureResponsesAgent.create_client()

    # 2. Create a Semantic Kernel agent for the OpenAI Responses API
    agent = AzureResponsesAgent(
        ai_model_id="o4-mini",  # Replace with your model deployment name
        client=client,
        instructions="Answer questions about the menu.",
        name="Host",
        plugins=[MenuPlugin()],
        reasoning_effort="high",  # Set the reasoning effort level
        store_enabled=False,  # Set to True if you want to store the responses in Azure
    )


    # 3. Create a thread for the agent
    # If no thread is provided, a new thread will be
    # created and returned with the initial response
    thread = None

    user_inputs = ["Hello", "What is the special soup?", "What is the special drink?", "How much is that?", "Thank you"]

    try:
        for user_input in user_inputs:
            print(f"# {AuthorRole.USER}: '{user_input}'")
            async for response in agent.invoke(
                messages=user_input,
                thread=thread,
                on_intermediate_message=handle_intermediate_steps,
            ):
                thread = response.thread
                print(f"# {response.name}: {response.content}")
    finally:
        if thread:
            await thread.delete()


await main()  # in a notebook; in a script, use asyncio.run(main())

NOTE that the errors for store_enabled=True and store_enabled=False are different.

When true: AgentExecutionException: ("<class 'semantic_kernel.agents.open_ai.azure_responses_agent.AzureResponsesAgent'> failed to complete the request", BadRequestError('Error code: 400 - {\'error\': {\'message\': "Item \'fc_688ca9068...\' of type \'function_call\' was provided without its required \'reasoning\' item: \'rs_688ca90...\'.", \'type\': \'invalid_request_error\', \'param\': \'input\', \'code\': None}}'))
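The store_enabled=True error indicates that a function_call item was replayed without the reasoning item the model emitted alongside it. A hypothetical illustration of that pairing constraint (the real check is server-side and matches items by id):

```python
# Hypothetical illustration of the constraint behind the 400 above: when a
# reasoning model's output items are replayed as input, every function_call
# must be accompanied by the reasoning item emitted just before it.
def find_orphan_function_calls(items: list[dict]) -> list[str]:
    """Return ids of function_call items not preceded by a reasoning item."""
    orphans = []
    pending_reasoning = False
    for item in items:
        if item["type"] == "reasoning":
            pending_reasoning = True
        elif item["type"] == "function_call":
            if not pending_reasoning:
                orphans.append(item["id"])
            pending_reasoning = False
    return orphans
```

In other words, if the agent's history management keeps the function_call but drops its reasoning sibling, this 400 is expected.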

When false: AgentExecutionException: ("<class 'semantic_kernel.agents.open_ai.azure_responses_agent.AzureResponsesAgent'> failed to complete the request", BadRequestError("Error code: 400 - {'error': {'message': 'No tool output found for function call call_ExGo1....', 'type': 'invalid_request_error', 'param': 'input', 'code': None}}"))
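The store_enabled=False error indicates the opposite gap: after a tool runs locally, the replayed history must contain a "function_call_output" input item whose call_id matches the model's function_call. A minimal sketch, with a hypothetical helper name:

```python
# Minimal sketch (helper name hypothetical): the Responses API expects a
# "function_call_output" input item whose call_id matches the model's
# function_call; omitting it yields "No tool output found for function call".
def append_tool_output(history: list[dict], call_id: str, result: str) -> list[dict]:
    """Append the tool result item the next request must carry."""
    return history + [{
        "type": "function_call_output",
        "call_id": call_id,
        "output": result,
    }]

history = [{"type": "function_call", "call_id": "call_1", "name": "get_specials", "arguments": "{}"}]
new_history = append_tool_output(history, "call_1", "Special Soup: Clam Chowder")
```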

Expected behavior
The conversation history should be managed properly when tools are involved.


Platform

  • Language: Python
  • Source: semantic-kernel==1.35.0
  • AI model: Azure OpenAI o4-mini

Labels

agents · bug (Something isn't working) · python (Pull requests for the Python Semantic Kernel)
