A remote HTTP MCP server that provides tools to search and retrieve Google Gemini API documentation. The server exposes the MCP protocol at the /mcp endpoint and can be deployed to Cloud Run or other containerized platforms. It also supports local stdio mode for development.
- Search Documentation: Full-text search across all Gemini documentation pages.
- Get Capabilities: List available documentation pages or retrieve content for a specific page.
- Get Current Model: Quickly access documentation for current Gemini models.
- Automatic Updates: Scrapes and updates documentation on server startup.
```mermaid
sequenceDiagram
    participant Client as MCP Client / IDE
    participant Server as FastMCP Server
    participant DB as SQLite Database
    Client->>Server: call_tool("search_documentation", queries=["embeddings"])
    Server->>DB: Full-text search for "embeddings"
    DB-->>Server: Return matching documentation
    Server-->>Client: Return formatted results
```
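On the wire, a `tools/call` exchange like the one above is a JSON-RPC 2.0 message POSTed to the `/mcp` endpoint. A minimal sketch of the request body, following the MCP specification (the session initialization handshake and HTTP headers are omitted for brevity):

```python
import json

# Sketch of the JSON-RPC 2.0 body an MCP client sends for the call above.
# The tool name and arguments match this server's search_documentation tool.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "search_documentation",
        "arguments": {"queries": ["embeddings"]},
    },
}
print(json.dumps(request, indent=2))
```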
- Ingestion: On startup, the server fetches `https://ai.google.dev/gemini-api/docs/llms.txt` to get a list of all available documentation pages.
- Processing: It then concurrently fetches and processes each page, extracting the text content.
- Indexing: The processed content is stored in a local SQLite database with a Full-Text Search (FTS5) index for efficient querying.
- Searching: When you use the `search_documentation` tool, the server queries this SQLite database to find the most relevant documentation pages.
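The indexing and search steps can be sketched with SQLite's built-in FTS5 support. The table and column names below are illustrative assumptions, not the server's actual schema:

```python
import sqlite3

# Minimal sketch of the FTS5 index/search approach described above.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE VIRTUAL TABLE docs USING fts5(url, content)")

# Indexing: store each processed page's URL and extracted text.
pages = [
    ("https://ai.google.dev/gemini-api/docs/embeddings",
     "How to generate text embeddings with the Gemini API."),
    ("https://ai.google.dev/gemini-api/docs/models",
     "Overview of current Gemini model variants and limits."),
]
conn.executemany("INSERT INTO docs (url, content) VALUES (?, ?)", pages)

# Searching: full-text match, best hits first (lower bm25 rank = better).
rows = conn.execute(
    "SELECT url FROM docs WHERE docs MATCH ? ORDER BY rank",
    ("embeddings",),
).fetchall()
print(rows)
```

Only the embeddings page matches the query, so it is the sole result returned.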
You can use `uvx` to run the server directly without explicit installation. This is the easiest way to get started.

```bash
uvx --from git+https://github.com/philschmid/gemini-api-docs-mcp gemini-docs-mcp
```

You can install the package directly from GitHub using pip:
```bash
pip install git+https://github.com/philschmid/gemini-api-docs-mcp.git
```

Alternatively, clone the repository and install from source:

```bash
git clone https://github.com/philschmid/gemini-api-docs-mcp.git
cd gemini-api-docs-mcp
pip install .
cd ..
rm -rf gemini-api-docs-mcp
```

The server runs as an HTTP server and exposes the MCP protocol at the `/mcp` endpoint. It respects the `PORT` environment variable (defaults to 8080).
```bash
# Set port (optional, defaults to 8080)
export PORT=8080

# Run the server
gemini-docs-mcp
```

The server will be accessible at `http://localhost:8080/mcp` (or your configured port).
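The `PORT` fallback could be handled as in the sketch below. This is an assumed helper for illustration, not the project's actual startup code:

```python
# Assumed port-resolution logic: read PORT from the environment,
# falling back to 8080 when it is unset or empty.
def resolve_port(env):
    return int(env.get("PORT") or 8080)

print(resolve_port({"PORT": "9090"}))  # 9090
print(resolve_port({}))                # 8080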
Build and run the Docker container:

```bash
# Build the image
docker build -t gemini-docs-mcp .

# Run the container
docker run -p 8080:8080 gemini-docs-mcp
```

Deploy to Google Cloud Run:
```bash
# Build and deploy
gcloud run deploy gemini-docs-mcp \
  --source . \
  --platform managed \
  --region us-central1 \
  --allow-unauthenticated
```

The server will be accessible at `https://<your-service-url>/mcp`.
If you don't set the `PORT` environment variable, the server runs in stdio mode for local MCP clients:

```bash
# Don't set PORT - runs in stdio mode
gemini-docs-mcp
```

The database is stored at:

- `/tmp/gemini-api-docs/database.db` in containerized environments
- `~/.mcp/gemini-api-docs/database.db` in local environments

You can override this by setting the `GEMINI_DOCS_DB_PATH` environment variable.
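The resolution order could look like the following sketch. The precedence (explicit override, then a container default, then a home-directory default) comes from the description above, but the container-detection check and function name are illustrative assumptions:

```python
from pathlib import Path

def resolve_db_path(env):
    # An explicit GEMINI_DOCS_DB_PATH override wins.
    if "GEMINI_DOCS_DB_PATH" in env:
        return Path(env["GEMINI_DOCS_DB_PATH"])
    # Crude container check, for illustration only.
    if Path("/.dockerenv").exists():
        return Path("/tmp/gemini-api-docs/database.db")
    # Local default under the user's home directory.
    return Path.home() / ".mcp/gemini-api-docs/database.db"

print(resolve_db_path({"GEMINI_DOCS_DB_PATH": "/data/db.sqlite"}))
```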
For remote HTTP servers, configure your MCP client to connect via HTTP:
```json
{
  "mcpServers": {
    "gemini-docs": {
      "url": "https://<your-service-url>/mcp"
    }
  }
}
```

For local development with stdio (if supported by your client):
```json
{
  "mcpServers": {
    "gemini-docs": {
      "command": "gemini-docs-mcp"
    }
  }
}
```

- `search_documentation(queries: list[str])`: Performs a full-text search of the Gemini documentation for the given list of queries (max 3).
- `get_capability_page(capability: str = None)`: Get a list of capabilities, or the content for a specific one.
- `get_current_model()`: Get documentation for current Gemini models.
MIT
We run a comprehensive evaluation harness to ensure the MCP server provides accurate and up-to-date code examples. The tests cover both Python and TypeScript SDKs.
| Metric | Value |
|---|---|
| Total Tests | 117 |
| Passed | 114 |
| Failed | 3 |
Last updated: 2025-11-03 13:29:01
You can find the detailed test results in `tests/result.json`.