MCP Chat is a command-line application for interactive chat with models via the OpenAI API. It supports document retrieval, command-based prompts, and extensible tools via the MCP (Model Context Protocol) architecture.
Prerequisites:

- Python 3.10+
- An OpenAI API key
- Create or edit the `.env` file in the project root and ensure the following variables are defined:

  ```
  OPENAI_API_KEY=""           # Your OpenAI API key
  OPENAI_MODEL="gpt-4o-mini"  # Optional: default model
  ```
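A minimal sketch of how these variables are typically loaded, assuming the `python-dotenv` package from the dependency list (the actual code in `main.py` may differ):

```python
# Sketch only: read the .env values described above.
import os

from dotenv import load_dotenv

load_dotenv()  # loads .env from the project root

api_key = os.environ["OPENAI_API_KEY"]            # required
model = os.getenv("OPENAI_MODEL", "gpt-4o-mini")  # optional, falls back to the default
```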
You can set up the project either with uv or manually with pip.

uv is a fast Python package installer and resolver. To set up with uv:

- Install uv if you don't have it:

  ```bash
  pip install uv
  ```

- Create and activate a virtual environment:

  ```bash
  uv venv
  source .venv/bin/activate  # On Windows: .venv\Scripts\activate
  ```

- Install dependencies:

  ```bash
  uv pip install -e .
  ```

- Run the project:

  ```bash
  uv run main.py
  ```

To set up manually without uv:

- Create and activate a virtual environment:

  ```bash
  python -m venv .venv
  source .venv/bin/activate  # On Windows: .venv\Scripts\activate
  ```

- Install dependencies:

  ```bash
  pip install openai python-dotenv prompt-toolkit "mcp[cli]==1.8.0"
  ```

- Run the project:

  ```bash
  python main.py
  ```

Type your message and press Enter to chat with the model.
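Under the hood, each turn is a call to the OpenAI chat completions API. A rough sketch of that loop (the project's `main.py` also handles history, `@` document retrieval, and `/` commands, and will differ in detail):

```python
# Illustrative chat loop only; not the project's actual implementation.
import os

from dotenv import load_dotenv
from openai import OpenAI

load_dotenv()
client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])
model = os.getenv("OPENAI_MODEL", "gpt-4o-mini")

while True:
    user_input = input("> ")
    reply = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": user_input}],
    )
    print(reply.choices[0].message.content)
```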
Use the `@` symbol followed by a document ID to include that document's content in your query:

```
> Tell me about @deposition.md
```
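Conceptually, each `@doc_id` mention is expanded into the document's contents before the prompt is sent to the model. A hypothetical sketch of that substitution (the real client may instead fetch documents through the MCP server; the `docs` store here is a placeholder):

```python
import re

# Placeholder document store; mcp_server.py holds the real one in its docs dictionary.
docs = {"deposition.md": "Transcript contents ..."}

def expand_mentions(query: str) -> str:
    """Replace every @doc_id mention with that document's text."""
    return re.sub(r"@(\S+)", lambda m: docs.get(m.group(1), m.group(0)), query)

print(expand_mentions("Tell me about @deposition.md"))
```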
Use the `/` prefix to execute commands defined in the MCP server:

```
> /summarize deposition.md
```

Commands support tab-completion.
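Commands like `/summarize` map to prompts exposed by the MCP server. A hedged sketch of how such a prompt could be registered with the `mcp` SDK's FastMCP server (the name, wording, and structure here are illustrative, not the project's actual code):

```python
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("mcp-chat")  # server name is illustrative

@mcp.prompt(name="summarize", description="Summarize a document by ID")
def summarize(doc_id: str) -> str:
    # FastMCP wraps the returned string as a prompt message for the chat client.
    return f"Summarize the document with ID {doc_id} in a few sentences."

if __name__ == "__main__":
    mcp.run(transport="stdio")  # serve prompts (and tools) over stdio
```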
Edit `mcp_server.py` to add documents to the `docs` dictionary.
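For example, a new entry might look like this (IDs and contents below are placeholders, not the real documents):

```python
# In mcp_server.py: the in-memory document store.
docs = {
    "deposition.md": "Transcript contents ...",          # existing placeholder entry
    "my_notes.md": "Contents of the newly added doc.",   # your new document
}
```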
To fully implement MCP:

- Complete the TODOs in `mcp_server.py`
- Implement the missing functionality in `mcp_client.py` (see the sketch below)
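As a starting point, `mcp_client.py` typically needs to spawn the server over stdio and enumerate what it exposes. This is the `mcp` SDK's generic connection pattern, not the project's finished code:

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    # Launch mcp_server.py as a subprocess and talk to it over stdio.
    params = StdioServerParameters(command="python", args=["mcp_server.py"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            prompts = await session.list_prompts()  # backs the /command menu
            tools = await session.list_tools()      # tools the model can call
            print([p.name for p in prompts.prompts])
            print([t.name for t in tools.tools])

asyncio.run(main())
```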
No linting or type checking is implemented.