
When using an LLM proxy to access a model, part of the code removes the model prefix, but the LLM proxy needs it #910

Open
@franckOL

Description


The issue is around here:
https://github.com/openai/openai-agents-python/blob/91c62c1dea97d02ad4974947e572d634d42496d1/src/agents/models/multi_provider.py#L128C1-L144C77
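For context, the linked lines in `MultiProvider` split the model string on the first `/` and forward only the suffix to the underlying provider, so a proxy that routes on the full prefixed name never sees it. A rough, illustrative paraphrase of that behaviour (not the exact source):

```python
# Illustrative paraphrase of the prefix handling in multi_provider.py (not the exact source).
def _split_prefix(model_name: str | None) -> tuple[str | None, str | None]:
    if model_name is None:
        return None, None
    if "/" in model_name:
        prefix, rest = model_name.split("/", 1)
        return prefix, rest  # e.g. "openai/gpt-4o" -> ("openai", "gpt-4o")
    return None, model_name

# The provider chosen for the prefix then receives only the suffix ("gpt-4o"),
# so an LLM proxy that expects the full name ("openai/gpt-4o") gets a bare model id.
```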

On the point that the LLM proxy needs the prefix, here is what I tried:

# In llmproxy\Lib\site-packages\agents\models\openai_chatcompletions.py
# If I re-add the prefix in the call around line 271, it works:
        ret = await self._get_client().chat.completions.create(
            model="openai/" + self.model,
