A Model Context Protocol (MCP) server implementation for Glean's search and chat capabilities. This server provides a standardized interface for AI models to interact with Glean's content search and conversational AI features through stdio communication.
- 🔍 Search Integration: Access Glean's powerful content search capabilities
- 💬 Chat Interface: Interact with Glean's AI assistant
- 🔄 MCP Compliant: Implements the Model Context Protocol specification
- Node.js v18 or higher
- Glean API credentials
With npm:

```bash
npm install @gleanwork/mcp-server
```

With pnpm:

```bash
pnpm install @gleanwork/mcp-server
```

With yarn:

```bash
yarn add @gleanwork/mcp-server
```
- Set up your Glean API credentials:

  ```bash
  export GLEAN_SUBDOMAIN=your_subdomain
  export GLEAN_API_TOKEN=your_api_token
  ```

- (Optional) For global tokens that support impersonation:

  ```bash
  export GLEAN_ACT_AS=user@example.com
  ```
To configure this MCP server in your MCP client (such as Claude Desktop, Windsurf, or Cursor), add the following to your MCP client settings:
```json
{
  "mcpServers": {
    "glean": {
      "command": "npx",
      "args": ["-y", "@gleanwork/mcp-server"],
      "env": {
        "GLEAN_SUBDOMAIN": "<glean instance subdomain>",
        "GLEAN_API_TOKEN": "<glean api token>"
      }
    }
  }
}
```
Replace the environment variable values with your actual Glean credentials.
Search Glean's content index using the Glean Search API, with support for various filtering and configuration options.

For complete parameter details, see the Search API documentation.
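Once a session is established, an MCP client invokes this tool through a standard `tools/call` request. The sketch below is illustrative only — the tool name and argument shape are assumptions, so consult the Search API documentation for the actual schema:

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "search",
    "arguments": {
      "query": "quarterly planning documents",
      "pageSize": 10
    }
  }
}
```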
Have conversational interactions with Glean's AI assistant using the Glean Chat API, including support for message history, citations, and various configuration options.

For complete parameter details, see the Chat API documentation.
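As with search, a chat exchange is sent as a `tools/call` request. The following is a hypothetical sketch — the tool name and argument shape are assumptions, and the real schema (including message history and citation options) is defined by the Chat API documentation:

```json
{
  "jsonrpc": "2.0",
  "id": 3,
  "method": "tools/call",
  "params": {
    "name": "chat",
    "arguments": {
      "message": "Summarize our Q3 planning documents"
    }
  }
}
```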
The server communicates via stdio, making it ideal for integration with AI models and other tools:

```bash
node build/index.js
```
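Because the transport is newline-delimited JSON-RPC 2.0 over stdio, a session begins with an `initialize` handshake before any tool calls. A sketch of the first client message (the client name, version, and protocol revision string here are illustrative):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "initialize",
  "params": {
    "protocolVersion": "2024-11-05",
    "capabilities": {},
    "clientInfo": { "name": "example-client", "version": "0.1.0" }
  }
}
```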
You can also run the server using Docker:

```bash
docker run \
  -e GLEAN_SUBDOMAIN=your_subdomain \
  -e GLEAN_API_TOKEN=your_api_token \
  ghcr.io/aaronsb/glean-mcp-server
```
The repository includes scripts to build multi-architecture Docker images (AMD64 and ARM64):
```bash
# Build multi-architecture image locally
./scripts/build.sh

# Build with custom image name and tag
./scripts/build.sh --image-name=myorg/glean-mcp --tag=v1.0.0

# Build and push to registry
./scripts/build.sh --push

# Build for specific platforms
./scripts/build.sh --platforms=linux/amd64,linux/arm64
```
For a more developer-friendly experience with better error handling and output management:
```bash
# Build for local development with improved output handling
./scripts/build-local.sh

# Enable verbose mode to see all output
./scripts/build-local.sh --verbose

# Build with custom image name and tag
./scripts/build-local.sh --image-name=myorg/glean-mcp --tag=v1.0.0

# Build and push to registry
./scripts/build-local.sh --push

# Build for specific platforms
./scripts/build-local.sh --platforms=linux/amd64,linux/arm64
```
The `build-local.sh` script provides:
- Redirected output to log files to avoid overwhelming the console
- Clear status indicators with colored success/failure markers
- Extracted and focused error messages for TypeScript compiler issues
- Log file size warnings and viewing tips for large outputs
- Verbose mode option for detailed debugging
The Docker image is built for both AMD64 and ARM64 architectures by default, making it compatible with a wide range of systems including Apple Silicon Macs and standard x86 servers.
The server can also be run in inspect mode, which provides additional debugging information:

```bash
pnpm inspector
```

This runs the MCP Inspector, which lets you execute and debug calls to the server.
Please see CONTRIBUTING.md for development setup and guidelines.
MIT License. See the LICENSE file for details.
- Documentation: docs.glean.com
- Issues: GitHub Issues
- Email: support@glean.com