Python: Cannot include kernel kwarg in semantic_function #11418

Closed
@tjprescott

Description

Describe the bug
Following the documentation here:
https://learn.microsoft.com/en-us/semantic-kernel/concepts/ai-services/chat-completion/function-calling/?pivots=programming-language-python#reserved-parameter-names-for-auto-function-calling

I'm trying to include the kernel argument so that it is automatically passed into my @kernel_function:

    # Defined inside a plugin class; Kernel is imported from semantic_kernel
    # and kernel_function from semantic_kernel.functions.
    @kernel_function(
        name="search_guidelines",
        description="Semantic search against the guidelines with natural language query.",
    )
    def search_guidelines(
        self,
        query: str,
        programming_language: str,
        kernel: Kernel,
    ) -> str:
        ...
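
For context, the objects used in the failing call below (kernel, chat_completion, execution_settings, history) are created roughly as follows. This is a minimal sketch: the plugin class name GuidelinesPlugin, the service_id, and the sample user message are illustrative, not taken from my actual code.

    from semantic_kernel import Kernel
    from semantic_kernel.connectors.ai.function_choice_behavior import FunctionChoiceBehavior
    from semantic_kernel.connectors.ai.open_ai import (
        AzureChatCompletion,
        AzureChatPromptExecutionSettings,
    )
    from semantic_kernel.contents import ChatHistory

    kernel = Kernel()

    # Azure OpenAI deployment, endpoint, and key come from environment variables.
    chat_completion = AzureChatCompletion(service_id="chat")
    kernel.add_service(chat_completion)

    # GuidelinesPlugin is the (illustrative) class holding search_guidelines above.
    kernel.add_plugin(GuidelinesPlugin(), plugin_name="guidelines")

    # Auto function calling, so the model can invoke search_guidelines on its own.
    execution_settings = AzureChatPromptExecutionSettings(
        function_choice_behavior=FunctionChoiceBehavior.Auto()
    )

    history = ChatHistory()
    history.add_user_message("Find the guidelines for logging in Python.")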

However, whenever I do this, the following code blows up:

        result = await chat_completion.get_chat_message_content(
            chat_history=history,
            settings=execution_settings,
            kernel=kernel,
        )

I get the following error:

Traceback (most recent call last):
  File "C:\Users\trpresco\.pyenv\pyenv-win\versions\3.12.9\Lib\site-packages\semantic_kernel\connectors\ai\open_ai\services\open_ai_handler.py", line 87, in _send_completion_request
    response = await self.client.chat.completions.create(**settings_dict)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\trpresco\.pyenv\pyenv-win\versions\3.12.9\Lib\site-packages\openai\resources\chat\completions\completions.py", line 2000, in create
    return await self._post(
           ^^^^^^^^^^^^^^^^^
  File "C:\Users\trpresco\.pyenv\pyenv-win\versions\3.12.9\Lib\site-packages\openai\_base_client.py", line 1767, in post
    return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\trpresco\.pyenv\pyenv-win\versions\3.12.9\Lib\site-packages\openai\_base_client.py", line 1461, in request
    return await self._request(
           ^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\trpresco\.pyenv\pyenv-win\versions\3.12.9\Lib\site-packages\openai\_base_client.py", line 1492, in _request
    request = self._build_request(options, retries_taken=retries_taken)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\trpresco\.pyenv\pyenv-win\versions\3.12.9\Lib\site-packages\openai\lib\azure.py", line 67, in _build_request
    return super()._build_request(options, retries_taken=retries_taken)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\trpresco\.pyenv\pyenv-win\versions\3.12.9\Lib\site-packages\openai\_base_client.py", line 506, in _build_request
    return self._client.build_request(  # pyright: ignore[reportUnknownMemberType]
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\trpresco\.pyenv\pyenv-win\versions\3.12.9\Lib\site-packages\httpx\_client.py", line 378, in build_request
    return Request(
           ^^^^^^^^
  File "C:\Users\trpresco\.pyenv\pyenv-win\versions\3.12.9\Lib\site-packages\httpx\_models.py", line 408, in __init__
    headers, stream = encode_request(
                      ^^^^^^^^^^^^^^^
  File "C:\Users\trpresco\.pyenv\pyenv-win\versions\3.12.9\Lib\site-packages\httpx\_content.py", line 216, in encode_request
    return encode_json(json)
           ^^^^^^^^^^^^^^^^^
  File "C:\Users\trpresco\.pyenv\pyenv-win\versions\3.12.9\Lib\site-packages\httpx\_content.py", line 177, in encode_json
    body = json_dumps(
           ^^^^^^^^^^^
  File "C:\Users\trpresco\.pyenv\pyenv-win\versions\3.12.9\Lib\json\__init__.py", line 238, in dumps
    **kw).encode(obj)
          ^^^^^^^^^^^
  File "C:\Users\trpresco\.pyenv\pyenv-win\versions\3.12.9\Lib\json\encoder.py", line 200, in encode
    chunks = self.iterencode(o, _one_shot=True)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\trpresco\.pyenv\pyenv-win\versions\3.12.9\Lib\json\encoder.py", line 258, in iterencode
    return _iterencode(o, 0)
           ^^^^^^^^^^^^^^^^^
  File "C:\Users\trpresco\.pyenv\pyenv-win\versions\3.12.9\Lib\json\encoder.py", line 180, in default
    raise TypeError(f'Object of type {o.__class__.__name__} '
TypeError: Object of type _PydanticGeneralMetadata is not JSON serializable

According to the documentation, this is supposed to "just work".

Expected behavior
The kernel object should be passed into my @kernel_function automatically.
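
For example, once the kernel is injected I'd expect to be able to reuse services registered on it from inside the function. A minimal sketch of what I mean (get_service and the service_id "chat" are just illustrative uses, matching the setup sketch above):

    from semantic_kernel import Kernel
    from semantic_kernel.functions import kernel_function

    class GuidelinesPlugin:
        @kernel_function(
            name="search_guidelines",
            description="Semantic search against the guidelines with natural language query.",
        )
        def search_guidelines(self, query: str, programming_language: str, kernel: Kernel) -> str:
            # The injected kernel can hand back services registered on it,
            # e.g. the chat completion service added with service_id="chat".
            chat_service = kernel.get_service(service_id="chat")
            ...
            return "..."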

Platform

  • Language: Python
  • Source: semantic-kernel 1.27.2
  • AI model: Azure OpenAI: GPT-4o-mini
  • IDE: VS Code
  • OS: Windows

Labels

bug, python
