
## Custom Model Provider Not Working #485

Closed as not planned

@Sghosh1999

📌 Context

I'm currently working on integrating a custom LLM into my application. Specifically, I'm using Groq's llama3-8b-8192 model through the ChatOpenAI class from the langchain_openai package:

from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    openai_api_base="https://api.groq.com/openai/v1",
    openai_api_key="gsk_XXXXXXXXXXXXXXXXXXXXXXXXXXXX",
    model_name="llama3-8b-8192",
    temperature=0,
    max_tokens=1000,
)
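
As a sanity check, this client can be invoked directly and it responds fine (a minimal sketch using LangChain's standard Runnable interface; the prompt string is just an illustration):

response = llm.invoke("Say hello in one sentence.")
print(response.content)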

I want to integrate this model into my existing setup, which uses a custom ModelProvider:

from openai import AsyncOpenAI
from langchain_openai import ChatOpenAI
from agents import ModelProvider, Model, OpenAIChatCompletionsModel

client = AsyncOpenAI(base_url='', api_key='')

class CustomModelProvider(ModelProvider):
    def get_model(self, model_name: str | None) -> Model:
        # Returning the LangChain client here, which is not an agents Model.
        # This is the part I'm unsure about.
        return ChatOpenAI(
            openai_api_base="https://api.groq.com/openai/v1",
            openai_api_key="gsk_XXXXXXXXXXXXXXXXXXXXXXXXXXXX",
            model_name="llama3-8b-8192",
            temperature=0,
            max_tokens=1000,
        )

CUSTOM_MODEL_PROVIDER = CustomModelProvider()

Could someone please guide me on how to do this?
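
For context, based on the model_providers example in the SDK repo, I believe get_model is expected to return an agents Model rather than a LangChain object. Something like the following sketch is my best guess (untested; pointing AsyncOpenAI at Groq's base URL and wrapping it in OpenAIChatCompletionsModel are my assumptions about the intended wiring, and the agent name and instructions are placeholders):

import asyncio

from openai import AsyncOpenAI
from agents import (
    Agent,
    Model,
    ModelProvider,
    OpenAIChatCompletionsModel,
    RunConfig,
    Runner,
)

# OpenAI-compatible client pointed at Groq's endpoint.
client = AsyncOpenAI(
    base_url="https://api.groq.com/openai/v1",
    api_key="gsk_XXXXXXXXXXXXXXXXXXXXXXXXXXXX",
)

class CustomModelProvider(ModelProvider):
    def get_model(self, model_name: str | None) -> Model:
        # Wrap the client in an agents Model instead of returning ChatOpenAI.
        return OpenAIChatCompletionsModel(
            model=model_name or "llama3-8b-8192",
            openai_client=client,
        )

CUSTOM_MODEL_PROVIDER = CustomModelProvider()

async def main():
    agent = Agent(name="Assistant", instructions="You are a helpful assistant.")
    # Route this run through the custom provider.
    result = await Runner.run(
        agent,
        "Hello!",
        run_config=RunConfig(model_provider=CUSTOM_MODEL_PROVIDER),
    )
    print(result.final_output)

asyncio.run(main())

If this is the right pattern, does anything else need to change? I understand tracing may need to be disabled via set_tracing_disabled when no OpenAI platform key is configured, but I'm not certain.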

Metadata

Assignees

No one assigned

Labels

bug (Something isn't working), needs-more-info (Waiting for a reply/more info from the author), stale
