
An adapter between Anthropic's Claude Code CLI and OpenAI’s GPT‑5, AND A LITELLM SERVER BOILERPLATE for creating your own AI assistants with LibreChat as the UI (a dual-purpose repo).

Claude Code with GPT-5

This repository lets you use Anthropic's Claude Code CLI with OpenAI's GPT-5 via a local LiteLLM proxy.

⚠️ ATTENTION ⚠️

If you're here to set up your own LiteLLM Server (potentially with LibreChat or a similar UI), head over to the main-boilerplate branch. It contains a "boilerplate" version of this repo with the Claude Code CLI integration stripped away for simplicity, and with a version of README.md that specifically explains how to build on top of this repo as a boilerplate.

Quick Start ⚡

Prerequisites

First time using GPT-5 via API?

If you are about to use GPT-5 via the API for the first time, OpenAI may require you to verify your identity via Persona: you may encounter an OpenAI error asking you to “verify your organization.” To resolve this, go through the verification process here:

Setup 🛠️

  1. Clone this repository:

    git clone https://github.com/teremterem/claude-code-gpt-5.git
    cd claude-code-gpt-5
  2. Configure Environment Variables:

    Copy the template file to create your .env:

    cp .env.template .env

    Edit .env and add your OpenAI API key:

    OPENAI_API_KEY=your-openai-api-key-here
    # Optional: only needed if you plan to use Anthropic models
    # ANTHROPIC_API_KEY=your-anthropic-api-key-here
    
    # Optional (see .env.template for details):
    # LITELLM_MASTER_KEY=your-master-key-here
    
    # Optional: override the default remaps if you need to (the values you see
    # below are the defaults - see .env.template for more info)
    # REMAP_CLAUDE_HAIKU_TO=gpt-5-mini-reason-minimal
    # REMAP_CLAUDE_SONNET_TO=gpt-5-reason-medium
    # REMAP_CLAUDE_OPUS_TO=gpt-5-reason-high
    
    # Some more optional settings (see .env.template for details)
    ...
  3. Run the proxy:

    1. EITHER via uv (make sure to install uv first):

      OPTION 1: Use a script for uv:

      ./uv-run.sh

      OPTION 2: Run via a direct uv command:

      uv run litellm --config config.yaml
    2. OR via Docker (make sure to install Docker Desktop first):

      OPTION 3: Run Docker in the foreground:

      ./run-docker.sh

      OPTION 4: Run Docker in the background:

      ./deploy-docker.sh

      OPTION 5: Run Docker via a direct command:

      docker run -d \
         --name claude-code-gpt-5 \
         -p 4000:4000 \
         --env-file .env \
         --restart unless-stopped \
         ghcr.io/teremterem/claude-code-gpt-5:latest

      NOTE: To run this command in the foreground instead of the background, remove the -d flag.

      To see the logs, run:

      docker logs -f claude-code-gpt-5

      To stop and remove the container, run:

      ./kill-docker.sh

      NOTE: The Docker options above pull the latest image from GHCR and ignore all your local files except .env. For more detailed Docker deployment instructions and more options (such as building the Docker image from source yourself, using Docker Compose, etc.), see docs/DOCKER_TIPS.md
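
The REMAP_CLAUDE_*_TO variables from step 2 control which GPT-5 alias each Claude model tier gets translated to. As a rough illustration of that mapping (a hypothetical sketch with the documented defaults, not the proxy's actual code):

```shell
# Hypothetical sketch of the remap logic; remap_model is made up for illustration.
remap_model() {
  case "$1" in
    *haiku*)  echo "${REMAP_CLAUDE_HAIKU_TO:-gpt-5-mini-reason-minimal}" ;;
    *sonnet*) echo "${REMAP_CLAUDE_SONNET_TO:-gpt-5-reason-medium}" ;;
    *opus*)   echo "${REMAP_CLAUDE_OPUS_TO:-gpt-5-reason-high}" ;;
    *)        echo "$1" ;;  # anything else passes through unchanged
  esac
}

# With the REMAP_* variables unset, the defaults apply:
remap_model "claude-sonnet-4"   # prints gpt-5-reason-medium
```

Overriding a REMAP_* variable in .env simply changes which alias the corresponding tier resolves to.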

Using with Claude Code 🎮

  1. Install Claude Code (if you haven't already):

    npm install -g @anthropic-ai/claude-code
  2. Connect to GPT-5 instead of Claude:

    ANTHROPIC_BASE_URL=http://localhost:4000 claude

    If you set LITELLM_MASTER_KEY for the proxy (see .env.template for details), pass it as the Anthropic API key for the CLI:

    ANTHROPIC_API_KEY="<LITELLM_MASTER_KEY>" \
    ANTHROPIC_BASE_URL=http://localhost:4000 \
    claude

    NOTE: In this case, if you've previously authenticated, run claude /logout first.

  3. That's it! Your Claude Code client will now use the selected GPT-5 variant(s) with your chosen reasoning effort level(s). 🎯
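
If you connect this way often, a small wrapper saves retyping the environment variables. This is an optional convenience (the claude_gpt5 name is made up here; assumes bash/zsh and that claude is on your PATH):

```shell
# Hypothetical convenience wrapper around the two-variable invocation above.
claude_gpt5() {
  ANTHROPIC_BASE_URL="http://localhost:4000" \
  ANTHROPIC_API_KEY="${LITELLM_MASTER_KEY:-}" \
  claude "$@"
}
```

After defining it (e.g. in your shell rc file), `claude_gpt5` behaves like `claude` pointed at the local proxy.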

Available GPT-5 model aliases

  • GPT-5:
    • gpt-5-reason-minimal
    • gpt-5-reason-low
    • gpt-5-reason-medium
    • gpt-5-reason-high
  • GPT-5-mini:
    • gpt-5-mini-reason-minimal
    • gpt-5-mini-reason-low
    • gpt-5-mini-reason-medium
    • gpt-5-mini-reason-high
  • GPT-5-nano:
    • gpt-5-nano-reason-minimal
    • gpt-5-nano-reason-low
    • gpt-5-nano-reason-medium
    • gpt-5-nano-reason-high

NOTE: In general, you can use arbitrary models from arbitrary providers, but for providers other than OpenAI or Anthropic you will need to prefix the model name with the provider, e.g. gemini/gemini-pro, gemini/gemini-pro-reason-disable, etc. (and set the respective API keys, along with any other environment variables the provider requires, in your .env file).
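
The aliases above follow a simple naming convention: the base model name, then a `-reason-<effort>` suffix. As a hypothetical illustration of how such a name decomposes (parse_alias is invented for this example and is not part of the proxy):

```shell
# Hypothetical: split an alias like "gpt-5-mini-reason-low" into its two parts.
parse_alias() {
  local name="$1"
  local model="${name%-reason-*}"    # strip the "-reason-<effort>" suffix
  local effort="${name##*-reason-}"  # keep only the effort level
  echo "model=$model effort=$effort"
}

parse_alias "gpt-5-mini-reason-low"   # prints model=gpt-5-mini effort=low
```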

KNOWN PROBLEM

The Web Search tool currently does not work with this setup. You may see an error like:

API Error (500 {"error":{"message":"Error calling litellm.acompletion for non-Anthropic model: litellm.BadRequestError: OpenAIException - Invalid schema for function 'web_search': 'web_search_20250305' is not valid under any of the given schemas.","type":"None","param":"None","code":"500"}}) · Retrying in 1 seconds… (attempt 1/10)

This is planned to be fixed soon.

NOTE: The Fetch tool (getting web content from specific URLs) is not affected and works normally.

P.S. You are welcome to join our MiniAgents Discord Server 👥

And if you like the project, please give it a Star 💫
