A Model Context Protocol (MCP) server that provides AI-powered code context retrieval using semantic search. This server integrates with GitHub Copilot and other AI coding assistants to provide relevant code snippets from your codebase.
- Semantic Code Search: Uses OpenAI embeddings to find semantically similar code chunks
- Vector Database: Stores code embeddings in Supabase for fast retrieval
- MCP Integration: Follows the Model Context Protocol for seamless integration with AI tools
- Datadog Monitoring: Comprehensive observability with APM tracing, custom metrics, and error tracking
- Two Tools:
  - get_code_context: Retrieves relevant code context from the codebase
  - augment_prompt: Creates a complete prompt with context for LLM consumption
- Python 3.10 or higher
- Supabase account with vector database setup
- OpenAI API key
- (Optional) Datadog account for monitoring and observability
- Clone the repository and navigate to the Server directory:
```shell
cd /Users/devabhi/Projects/Peyote/Server
```
- Create and activate a virtual environment:
```shell
python -m venv .venv
source .venv/bin/activate  # On Windows: .venv\Scripts\activate
```
- Install dependencies:
```shell
pip install -r requirements.txt
```
- Create a `.env` file with your credentials:
```
SUPABASE_URL=your_supabase_url
SUPABASE_SERVICE_KEY=your_supabase_service_key
OPENAI_API_KEY=your_openai_api_key

# Optional: Datadog Monitoring
DD_API_KEY=your_datadog_api_key
DD_APP_KEY=your_datadog_app_key
DD_SERVICE_NAME=peyote-ingest
DD_ENV=production
```
The server uses stdio for communication, following the MCP protocol:
```shell
python src/mcp_server.py
```
- Open Cursor Settings (Cmd+Shift+J or Ctrl+Shift+J)
- Navigate to "Cursor Settings" > "Model Context Protocol"
- Add a new MCP server configuration by editing the configuration file (usually at `~/.cursor/mcp_settings.json`) or through the UI:
```json
{
  "mcpServers": {
    "peyote-code-context": {
      "command": "python",
      "args": ["/Users/devabhi/Projects/Peyote/Server/src/mcp_server.py"],
      "env": {
        "SUPABASE_URL": "your_supabase_url",
        "SUPABASE_SERVICE_KEY": "your_supabase_service_key",
        "OPENAI_API_KEY": "your_openai_api_key"
      }
    }
  }
}
```
Or use a `.env` file by setting the working directory:
```json
{
  "mcpServers": {
    "peyote-code-context": {
      "command": "python",
      "args": ["/Users/devabhi/Projects/Peyote/Server/src/mcp_server.py"],
      "cwd": "/Users/devabhi/Projects/Peyote/Server"
    }
  }
}
```
- Restart Cursor to load the MCP server
- Install the latest version of VS Code with GitHub Copilot extension
- Configure the MCP server in your VS Code settings (`.vscode/settings.json` or user settings):
```json
{
  "github.copilot.mcp.servers": {
    "peyote-code-context": {
      "command": "python",
      "args": ["/Users/devabhi/Projects/Peyote/Server/src/mcp_server.py"],
      "env": {
        "SUPABASE_URL": "your_supabase_url",
        "SUPABASE_SERVICE_KEY": "your_supabase_service_key",
        "OPENAI_API_KEY": "your_openai_api_key"
      }
    }
  }
}
```
Or use a `.env` file and reference it:
```json
{
  "github.copilot.mcp.servers": {
    "peyote-code-context": {
      "command": "python",
      "args": ["/Users/devabhi/Projects/Peyote/Server/src/mcp_server.py"],
      "cwd": "/Users/devabhi/Projects/Peyote/Server"
    }
  }
}
```
- Restart VS Code to load the MCP server
Once configured, you can use the tools in Cursor's AI chat:
```
Use @peyote-code-context to get code context for: "def calculate_similarity(a, b):"
```
Or invoke the tools directly through the MCP interface.

Once configured in VS Code, you can use the tools in Copilot Chat:
```
@peyote-code-context get_code_context with code_snippet: "def calculate_similarity(a, b):"
```
Or:
```
@peyote-code-context augment_prompt with code_snippet: "class UserRepository:"
```
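Under the hood, the MCP client talks to the server over stdio using JSON-RPC 2.0. A `tools/call` request for `get_code_context` would look roughly like this (a sketch based on the MCP wire format, not captured from a real session):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "get_code_context",
    "arguments": { "code_snippet": "def calculate_similarity(a, b):" }
  }
}
```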
Retrieves relevant code context from the codebase based on a code snippet.
Parameters:
- `code_snippet` (string, required): The code snippet to find context for
Example:
```json
{
  "code_snippet": "def process_payment(amount, user_id):"
}
```
Returns: A formatted markdown document with relevant code chunks, including file paths and similarity scores.
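The returned document could be assembled roughly like this (a minimal sketch; the field names `path`, `score`, and `content` are assumptions for illustration, not the server's actual response shape):

```python
FENCE = "`" * 3  # built dynamically so this example nests cleanly in markdown


def format_context(chunks):
    """Render retrieved chunks as a markdown context document."""
    sections = ["# Relevant Code Context"]
    for chunk in chunks:
        sections.append(
            f"## {chunk['path']} (similarity: {chunk['score']:.2f})\n"
            f"{FENCE}\n{chunk['content']}\n{FENCE}"
        )
    return "\n\n".join(sections)


doc = format_context([
    {"path": "src/payments.py", "score": 0.91,
     "content": "def process_payment(amount, user_id): ..."},
])
```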
Augments a code completion prompt with relevant context from the codebase.
Parameters:
- `code_snippet` (string, required): The code snippet to augment with context
Example:
```json
{
  "code_snippet": "class PaymentProcessor:"
}
```
Returns: A complete prompt that can be used with an LLM, including the original code and the retrieved context.
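Conceptually, the augmented prompt is just the retrieved context prepended to the original snippet. A minimal sketch (the exact prompt wording is an assumption, not the server's actual template):

```python
def augment_prompt(code_snippet, context_chunks):
    """Prepend retrieved context chunks to the snippet to build an LLM prompt."""
    context = "\n\n".join(context_chunks)
    return (
        "You are completing code. Here is related code from the same codebase:\n\n"
        f"{context}\n\n"
        "Complete the following code:\n\n"
        f"{code_snippet}"
    )


prompt = augment_prompt(
    "class PaymentProcessor:",
    ["class UserRepository:\n    def get(self, user_id): ..."],
)
```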
Before using the MCP server, you need to ingest your codebase into the vector database:
```shell
python src/ingest.py
```
This will:
- Parse your codebase
- Split it into meaningful chunks
- Generate embeddings using OpenAI
- Store the embeddings in Supabase
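The chunking step can be illustrated with a simple line-based splitter with overlap (the real `ingest.py` may chunk differently, e.g. by function or class boundaries; this is only a sketch):

```python
def chunk_lines(text, max_lines=40, overlap=5):
    """Split text into chunks of up to max_lines lines, overlapping by `overlap`."""
    lines = text.splitlines()
    chunks, start = [], 0
    while start < len(lines):
        end = min(start + max_lines, len(lines))
        chunks.append("\n".join(lines[start:end]))
        if end == len(lines):
            break
        start = end - overlap  # overlap preserves context across chunk boundaries
    return chunks


sample = "\n".join(f"line {i}" for i in range(100))
chunks = chunk_lines(sample)
```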
```
User Code (VS Code)
        ↓
GitHub Copilot
        ↓
MCP Server (mcp_server.py)
        ↓
OpenAI Embeddings API
        ↓
Supabase Vector Database
        ↓
Retrieved Context
        ↓
GitHub Copilot (with context)
```
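At the core of the retrieval step is vector similarity between the query embedding and the stored chunk embeddings. In this setup the comparison happens server-side in Supabase, but the underlying math reduces to cosine similarity:

```python
import math


def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)


# Vectors pointing the same way score 1.0; orthogonal vectors score 0.0
assert cosine_similarity([1.0, 0.0], [2.0, 0.0]) == 1.0
assert cosine_similarity([1.0, 0.0], [0.0, 1.0]) == 0.0
```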
The ingestion service (`ingest.py`) includes comprehensive Datadog integration for monitoring and observability:
- Custom Metrics: Track ingestion performance, error rates, and resource usage
- Error Tracking: Automatic exception capture with stack traces
- Structured Logging: Centralized logging with correlation to traces
- Add your Datadog credentials to `.env`:
```
DD_API_KEY=your_datadog_api_key
DD_APP_KEY=your_datadog_app_key
DD_SERVICE_NAME=peyote-ingest
DD_ENV=production
```
- Run the service:
```shell
python src/ingest.py
```
- View metrics in the Datadog dashboard
- If the MCP server doesn't appear in Cursor, check the configuration file location (`~/.cursor/mcp_settings.json` on macOS/Linux, `%APPDATA%\Cursor\mcp_settings.json` on Windows)
- Restart Cursor completely after making configuration changes
- Check Cursor's developer console (Help > Toggle Developer Tools) for MCP-related errors
- Ensure you're using a recent version of Cursor that supports MCP
- Ensure Python path is correct in the configuration
- Check that all environment variables are set
- Verify the virtual environment is activated if using one
- Make sure you've run the ingestion script first
- Verify your Supabase database has the `match_code_chunks` function
- Check that your OpenAI API key is valid
- Ensure the Python script has execute permissions
- Verify the paths in your configuration are absolute paths
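For the "check that all environment variables are set" step, a small helper like this can pinpoint what's missing (a hypothetical utility, not part of the server's code; pass `os.environ` in practice):

```python
# Required variables from the .env setup above
REQUIRED = ("SUPABASE_URL", "SUPABASE_SERVICE_KEY", "OPENAI_API_KEY")


def missing_vars(env, required=REQUIRED):
    """Return the names of required variables that are unset or empty."""
    return [name for name in required if not env.get(name)]


# Example against a partially populated mapping (use os.environ for real checks)
missing = missing_vars({"SUPABASE_URL": "https://example.supabase.co"})
```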