Description
📌 Context
I'm currently working on integrating a custom LLM into my application. Specifically, I'm using Groq's `llama3-8b-8192` model through the `ChatOpenAI` class from the `langchain_openai` package:
```python
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    openai_api_base="https://api.groq.com/openai/v1",
    openai_api_key="gsk_XXXXXXXXXXXXXXXXXXXXXXXXXXXX",
    model_name="llama3-8b-8192",
    temperature=0,
    max_tokens=1000,
)
```
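On its own, this instance can be exercised directly through LangChain's standard `invoke` method (a minimal sanity check; the prompt is illustrative):

```python
# Quick check that the Groq endpoint responds through LangChain.
# llm is the ChatOpenAI instance defined above.
response = llm.invoke("Say hello in one word.")
print(response.content)  # the AIMessage's text content
```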
I aim to integrate this model into my existing setup, which uses a custom `ModelProvider`:
```python
from openai import AsyncOpenAI
from agents import ModelProvider, Model, OpenAIChatCompletionsModel

client = AsyncOpenAI(base_url='', api_key='')  # left blank here; currently unused

class CustomModelProvider(ModelProvider):
    def get_model(self, model_name: str | None) -> Model:
        # Currently returning the LangChain ChatOpenAI instance directly:
        return ChatOpenAI(
            openai_api_base="https://api.groq.com/openai/v1",
            openai_api_key="gsk_XXXXXXXXXXXXXXXXXXXXXXXXXXXX",
            model_name="llama3-8b-8192",
            temperature=0,
            max_tokens=1000,
        )

CUSTOM_MODEL_PROVIDER = CustomModelProvider()
```
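From the Agents SDK docs, my understanding is that `get_model` is expected to return an `agents.Model` (e.g. `OpenAIChatCompletionsModel` wrapping an `AsyncOpenAI` client), not a LangChain `ChatOpenAI` object. Something like the following is what I would expect the SDK-native wiring to look like: an untested sketch, assuming Groq's OpenAI-compatible endpoint and the placeholder key above, with sampling parameters moved onto the agent's `ModelSettings` (the agent name and prompt are illustrative):

```python
import asyncio

from openai import AsyncOpenAI
from agents import (
    Agent,
    Model,
    ModelProvider,
    ModelSettings,
    OpenAIChatCompletionsModel,
    RunConfig,
    Runner,
    set_tracing_disabled,
)

# One shared async client pointed at Groq's OpenAI-compatible endpoint.
groq_client = AsyncOpenAI(
    base_url="https://api.groq.com/openai/v1",
    api_key="gsk_XXXXXXXXXXXXXXXXXXXXXXXXXXXX",
)
set_tracing_disabled(True)  # assumption: no OpenAI key available for trace export


class GroqModelProvider(ModelProvider):
    def get_model(self, model_name: str | None) -> Model:
        # Wrap the Chat Completions endpoint in the SDK's Model interface.
        return OpenAIChatCompletionsModel(
            model=model_name or "llama3-8b-8192",
            openai_client=groq_client,
        )


GROQ_MODEL_PROVIDER = GroqModelProvider()


async def main() -> None:
    # temperature/max_tokens live on the agent's ModelSettings here,
    # not on the model wrapper itself.
    agent = Agent(
        name="Assistant",  # illustrative
        instructions="You are a helpful assistant.",
        model_settings=ModelSettings(temperature=0, max_tokens=1000),
    )
    result = await Runner.run(
        agent,
        "Say hello in one word.",
        run_config=RunConfig(model_provider=GROQ_MODEL_PROVIDER),
    )
    print(result.final_output)


asyncio.run(main())
```

If that is the intended pattern, it's unclear to me where (or whether) the existing LangChain `ChatOpenAI` configuration fits in, which is what I'm hoping to resolve.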