An MCP server for xAI's Grok API, supporting image understanding, image generation, live web search, and reasoning models.
- Multiple Grok Models: Access to Grok-4.1-Fast-Reasoning, Grok-4.1-Fast-Non-Reasoning, Grok-4-Fast, Grok-3-Mini, and more
- Image Generation: Create images using Grok's image generation model
- Vision Capabilities: Analyze images with Grok's vision models
- Live Web Search: Real-time web search with citations from news, web, X, and RSS feeds
- Reasoning Models: Advanced reasoning with extended thinking models (Grok-4.1-Fast-Reasoning, Grok-3-Mini, Grok-4)
- Stateful Conversations: Maintain conversation context across multiple requests via a server-stored response ID
- Conversation History: Toggle prior context on or off per request
- Python 3.11 or higher
- xAI API key
- Astral UV
- Clone the repository:

```shell
git clone https://github.com/merterbak/Grok-MCP.git
cd Grok-MCP
```

- Create a venv environment:

```shell
uv venv
source .venv/bin/activate  # macOS/Linux
.venv\Scripts\activate     # Windows
```

- Install dependencies:

```shell
uv sync
```

Add this to your Claude Desktop configuration file:
```json
{
  "mcpServers": {
    "grok": {
      "command": "uv",
      "args": [
        "--directory",
        "/path/to/Grok-MCP",
        "run",
        "python",
        "main.py"
      ],
      "env": {
        "XAI_API_KEY": "your_api_key_here"
      }
    }
  }
}
```

Claude Desktop can't send images uploaded in the chat to an MCP tool. The easiest way to give it access to files on your computer is the official Filesystem MCP server. Once it is set up, you can simply write an image's file path (such as /Users/mert/Desktop/image.png) in chat and Claude can pass it to any vision tool.
```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-filesystem",
        "/Users/<your-username>/Desktop",
        "/Users/<your-username>/Downloads"
      ]
    }
  }
}
```
For stdio:

```shell
uv run python main.py
```

Docker:

```shell
docker compose up --build
```

MCP Inspector:

```shell
mcp dev main.py
```

List all available Grok models with creation dates and ownership information.
Standard chat completion with extensive customization options.
Parameters:
- `prompt` (required): Your message
- `model`: Model to use (default: "grok-4-1-fast-non-reasoning")
- `system_prompt`: Optional system instruction
- `use_conversation_history`: Enable multi-turn conversations
- `temperature`, `max_tokens`, `top_p`: Generation parameters
- `presence_penalty`, `frequency_penalty`, `stop`: Advanced control
- `reasoning_effort`: For reasoning models ("low" or "high")
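To illustrate how these parameters map onto a request, here is a minimal sketch of a payload builder, assuming the OpenAI-compatible chat-completions shape that xAI exposes. The helper name and defaults are illustrative, not this server's actual code:

```python
def build_chat_payload(prompt, model="grok-4-1-fast-non-reasoning",
                       system_prompt=None, temperature=None, max_tokens=None,
                       top_p=None, reasoning_effort=None):
    """Assemble a chat-completions request body from the tool's parameters.

    Only parameters the caller actually sets are included, so the rest
    keep their server-side defaults. (Illustrative helper, not this repo's code.)
    """
    messages = []
    if system_prompt:
        messages.append({"role": "system", "content": system_prompt})
    messages.append({"role": "user", "content": prompt})
    payload = {"model": model, "messages": messages}
    optional = {"temperature": temperature, "max_tokens": max_tokens,
                "top_p": top_p, "reasoning_effort": reasoning_effort}
    for key, value in optional.items():
        if value is not None:
            payload[key] = value
    return payload
```

Keeping unset parameters out of the payload matters: sending explicit nulls can override server defaults on some APIs.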
Get detailed reasoning along with the response.
Parameters:
- `prompt` (required): Your question or task
- `model`: "grok-4", "grok-3-mini", "grok-3-mini-fast", or "grok-4-1-fast-reasoning" (default: "grok-4-1-fast-reasoning")
- `reasoning_effort`: "low" or "high" (not for grok-4)
- `system_prompt`, `temperature`, `max_tokens`, `top_p`
Returns: Content, reasoning content, and usage statistics
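A sketch of how such a response might be unpacked, assuming the reasoning trace arrives as a `reasoning_content` field on the message alongside the normal `content` (the field names are assumptions about the API shape, not taken from this server's code):

```python
def unpack_reasoning_response(response: dict) -> dict:
    """Split a chat-completions response into content, reasoning, and usage.

    Assumes the reasoning trace is returned as `reasoning_content` on the
    message object (an assumption about the response shape).
    """
    message = response["choices"][0]["message"]
    return {
        "content": message.get("content", ""),
        "reasoning_content": message.get("reasoning_content", ""),
        "usage": response.get("usage", {}),
    }
```

Using `.get()` with defaults keeps the helper safe for non-reasoning models, which simply omit the reasoning field.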
Analyze images with natural language queries.
Parameters:
- `prompt` (required): Your question about the image(s)
- `image_paths`: List of local image file paths
- `image_urls`: List of image URLs
- `detail`: "auto", "low", or "high"
- `model`: Vision-capable model (default: "grok-4-1-fast-non-reasoning")
Supported formats: JPG, JPEG, PNG
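Local image paths have to be inlined before a vision model can see them. A sketch of how a file path could become the base64 data-URL image part used by OpenAI-style vision APIs (the exact message-part shape is an assumption, not taken from this server's code):

```python
import base64
import mimetypes

def image_part_from_path(path: str, detail: str = "auto") -> dict:
    """Base64-encode a local JPG/JPEG/PNG file into an image_url message part."""
    mime, _ = mimetypes.guess_type(path)
    if mime not in ("image/jpeg", "image/png"):
        raise ValueError(f"Unsupported image format: {path}")
    with open(path, "rb") as f:
        b64 = base64.b64encode(f.read()).decode("ascii")
    return {"type": "image_url",
            "image_url": {"url": f"data:{mime};base64,{b64}", "detail": detail}}
```

Checking the MIME type before reading mirrors the JPG/JPEG/PNG restriction above; URLs in `image_urls` can skip this step and be passed through as-is.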
Create images from text descriptions.
Parameters:
- `prompt` (required): Image description
- `n`: Number of images to generate (default: 1)
- `response_format`: "url" or "b64_json"
- `model`: Image generation model (default: "grok-2-image-1212")
Returns: Generated images and revised prompt
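As a sketch, the image-generation request body might be assembled like this, assuming an OpenAI-style images payload (illustrative only, not this server's actual code):

```python
def build_image_payload(prompt, n=1, response_format="url",
                        model="grok-2-image-1212"):
    """Assemble an image-generation request body from the tool's parameters."""
    if response_format not in ("url", "b64_json"):
        raise ValueError("response_format must be 'url' or 'b64_json'")
    if n < 1:
        raise ValueError("n must be at least 1")
    return {"model": model, "prompt": prompt, "n": n,
            "response_format": response_format}
```

Choosing "b64_json" returns the image bytes inline, which avoids a second fetch but produces much larger responses than "url".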
Search the web in real-time with source citations.
Parameters:
- `prompt` (required): Your search query
- `model`: Model to use (default: "grok-4-1-fast-non-reasoning")
- `mode`: "on" or "off"
- `return_citations`: Include source citations (default: true)
- `from_date`, `to_date`: Date range (YYYY-MM-DD)
- `max_search_results`: Max results to fetch (default: 20)
- `country`: Country code for localized search
- `rss_links`: List of RSS feed URLs to search
- `sources`: Custom source configuration
Returns: Content, citations, usage stats, and number of sources used
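Live search is configured through a block of search parameters sent alongside the normal chat payload. A sketch assuming xAI's `search_parameters` shape, with field names following the tool's parameter list (treat the exact structure, especially the `sources` entries, as assumptions):

```python
def build_search_parameters(mode="on", return_citations=True,
                            from_date=None, to_date=None,
                            max_search_results=20, country=None,
                            rss_links=None):
    """Assemble the search_parameters block for a live-search request."""
    params = {"mode": mode, "return_citations": return_citations,
              "max_search_results": max_search_results}
    if from_date:
        params["from_date"] = from_date
    if to_date:
        params["to_date"] = to_date
    sources = []
    if country:
        # Localize the web and news sources to one country (assumed shape).
        sources.extend([{"type": "web", "country": country},
                        {"type": "news", "country": country}])
    if rss_links:
        sources.append({"type": "rss", "links": rss_links})
    if sources:
        params["sources"] = sources
    return params
```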
Maintain conversation state across multiple requests on xAI servers.
Parameters:
- `prompt` (required): Your message
- `response_id`: Previous response ID to continue a conversation
- `model`: Model to use (default: "grok-4-1-fast-non-reasoning")
- `system_prompt`: System instruction (only for new conversations)
- `include_reasoning`: Include reasoning summary
- `temperature`, `max_tokens`
Returns: Response with ID for continuing the conversation (stored for 30 days)
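Continuing a stored conversation amounts to threading each returned ID into the next request. A sketch, assuming a Responses-style payload with `previous_response_id` and a `store` flag (assumptions about the API shape, not this server's code):

```python
def build_responses_payload(prompt, response_id=None,
                            model="grok-4-1-fast-non-reasoning",
                            system_prompt=None, store=True):
    """Assemble a stateful-conversation request body.

    `response_id` threads the new message onto a stored conversation;
    `system_prompt` is only applied when starting a new one, matching
    the tool's parameter list above.
    """
    payload = {"model": model, "input": prompt, "store": store}
    if response_id:
        payload["previous_response_id"] = response_id
    elif system_prompt:
        payload["instructions"] = system_prompt
    return payload
```

Because history lives on xAI's servers, each follow-up request only carries the new message and the previous ID, not the full transcript.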
Retrieve a previously stored conversation response.
Parameters:
- `response_id` (required): The response ID to retrieve
Delete a stored conversation from xAI servers.
Parameters:
- `response_id` (required): The response ID to delete
This project is open source and available under the MIT License.