
MCP server with custom OpenAI client #470

Closed as not planned

Description

@AnaMariaIlie

Hello,

I'm trying to run mcp/filesystem_example with a custom OpenAI client, but I get errors related to the API key (without the MCP server, the same client works fine). I'm not sure the agent is actually using my custom client. Am I setting it up correctly?

```python
import asyncio
import os
import shutil

from agents import Agent, Runner, gen_trace_id, trace
from agents.mcp import MCPServer, MCPServerStdio

from openai import OpenAI
from agents import set_default_openai_client


async def run(mcp_server: MCPServer):
    agent = Agent(
        name="Assistant",
        instructions="Use the tools to read the filesystem and answer questions based on those files.",
        mcp_servers=[mcp_server],
        model="openai/gpt-4o-mini",
    )

    # List the files it can read
    # Ask about books
    # Ask a question that reads then reasons.


client = OpenAI(
    api_key="my api key",
    base_url="https://llmproxy...")
set_default_openai_client(client, True)

async with MCPServerStdio(
    name="Filesystem Server, via npx",
    params={
        "command": "npx",
        "args": ["-y", "@modelcontextprotocol/server-filesystem", samples_dir],
    },
) ...
```
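
For reference, here is the stripped-down setup I was trying to follow, without the MCP server, so it's clearer what I mean by "custom client". This is only a sketch of my understanding, assuming the client has to be an `AsyncOpenAI` instance, that `set_default_openai_api("chat_completions")` is what routes calls through the proxy's chat-completions endpoint, and that the plain `gpt-4o-mini` model name (without the `openai/` prefix) is appropriate here; the API key and base URL are placeholders:

```python
import asyncio

from openai import AsyncOpenAI
from agents import Agent, Runner, set_default_openai_api, set_default_openai_client

# Placeholder credentials and proxy URL (not real values).
custom_client = AsyncOpenAI(api_key="my api key", base_url="https://llmproxy...")

# Register the custom client before creating any Agent.
set_default_openai_client(custom_client)
# Assumption: the proxy only exposes the Chat Completions API,
# so route model calls through it instead of the Responses API.
set_default_openai_api("chat_completions")


async def main():
    agent = Agent(
        name="Assistant",
        instructions="Answer the question.",
        model="gpt-4o-mini",  # assuming no "openai/" prefix is needed here
    )
    result = await Runner.run(agent, "Say hello.")
    print(result.final_output)


if __name__ == "__main__":
    asyncio.run(main())
```

Is this the intended way to point the agent (and the MCP filesystem example) at a custom client, or does the MCP path pick up a different client somewhere?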
