MCP Server - DeepL is a production-ready Model Context Protocol (MCP) server that provides seamless integration with DeepL's industry-leading neural machine translation API. Built with an enterprise-grade architecture and strict type safety, this server enables AI assistants and agents to translate text and documents across 30+ languages, detect languages, manage custom glossaries for consistent terminology, and use DeepL's advanced features such as formality control and context-aware translations, all through a type-safe, async-first interface powered by FastMCP.
- Official Client: Built on the official deepl-python library for reliability
- Full API Coverage: Complete implementation of the DeepL API (translation, glossaries, usage tracking)
- Strongly Typed: All responses use Pydantic models for type safety
- Dual Transport: Supports both stdio and HTTP (streamable-http) modes
- Async/Await: Async wrapper for seamless MCP integration
- Type Safe: Full mypy strict mode compliance
- Production Ready: Docker support, comprehensive tests, CI/CD pipeline
- Developer Friendly: Makefile commands, auto-formatting, fast feedback
- High-Quality Translation: Superior to Google Translate in quality
- 30+ Languages: European and Asian languages
- Document Translation: PDF, DOCX, PPTX, XLSX, HTML, TXT
- Custom Glossaries: Consistent terminology across translations
- Formality Control: Formal/informal tone for supported languages
This server follows the standard MCP server architecture:
src/mcp_deepl/
├── __init__.py # Package initialization
├── server.py # FastMCP server with tool definitions
├── api_client.py # Async wrapper around official DeepL Python client
└── api_models.py # Pydantic models for type safety
tests/ # Unit tests with pytest + AsyncMock
e2e/ # End-to-end Docker integration tests
Key Implementation Details:
- Uses the official deepl-python library for reliable API communication
- Wraps the official client with async methods for MCP compatibility (a sketch follows below)
- Maintains full type safety with Pydantic models
- Supports all DeepL API features including glossaries and document translation
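To illustrate the wrapping approach described above, here is a minimal sketch of how a Pydantic response model and an async wrapper around the official deepl-python client can fit together. The names used here (TranslationResult, DeepLClient) are illustrative assumptions, not the actual contents of api_models.py or api_client.py.

```python
# Minimal sketch of the async-wrapper pattern described above.
# Class and field names are illustrative, not the real api_models.py / api_client.py.
import asyncio
import os

import deepl
from pydantic import BaseModel


class TranslationResult(BaseModel):
    """Typed response returned to MCP tools."""

    text: str
    detected_source_lang: str
    target_lang: str


class DeepLClient:
    """Async facade over the synchronous official client."""

    def __init__(self, auth_key: str) -> None:
        self._translator = deepl.Translator(auth_key)

    async def translate_text(self, text: str, target_lang: str) -> TranslationResult:
        # The official client is synchronous, so run it in a worker thread
        # to avoid blocking the event loop that serves MCP requests.
        result = await asyncio.to_thread(
            self._translator.translate_text, text, target_lang=target_lang
        )
        return TranslationResult(
            text=result.text,
            detected_source_lang=result.detected_source_lang,
            target_lang=target_lang,
        )


async def main() -> None:
    client = DeepLClient(os.environ["DEEPL_API_KEY"])
    print(await client.translate_text("Hello, world!", "DE"))


if __name__ == "__main__":
    asyncio.run(main())
```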
# Install package
uv pip install -e .
# Install with dev dependencies
uv pip install -e . --group dev

# Or install with pip
pip install -e .

- Copy the example environment file:
cp .env.example .env
- Edit .env and add your DeepL API key:
DEEPL_API_KEY=your_api_key_here

How to get credentials:
- Go to deepl.com/pro-api
- Sign up for an account (Free or Pro)
- Go to Account Settings
- Find your API key under "Authentication Key for DeepL API"
- Copy the key and store it as DEEPL_API_KEY
API Key Format:
- Free tier keys end with :fx (e.g., abc123:fx)
- Pro keys do not have the :fx suffix
- The server automatically detects which endpoint to use
The .env file is automatically loaded when the server starts.
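As a rough sketch of that startup step, assuming python-dotenv (listed in the requirements below), the key is read from the environment and handed to the official client, which selects the Free or Pro endpoint based on the :fx suffix. The exact loading code in server.py may differ.

```python
# Sketch of the .env startup step, assuming python-dotenv.
import os

import deepl
from dotenv import load_dotenv

load_dotenv()  # reads DEEPL_API_KEY from .env into the process environment

auth_key = os.environ["DEEPL_API_KEY"]

# The official deepl-python client picks the correct endpoint from the key:
# keys ending in ":fx" use the Free API endpoint, all others use the Pro endpoint.
translator = deepl.Translator(auth_key)
print(translator.get_usage())
```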
make run-stdio
# or
uv run fastmcp run src/mcp_deepl/server.py

make run-http
# or
uv run uvicorn mcp_deepl.server:app --host 0.0.0.0 --port 8000
# Test the server is running
make test-http

# Build image locally
make docker-build
# Build and push multi-platform image (amd64 + arm64)
make release VERSION=1.0.0
# Run container
make docker-run

Add to your Claude Desktop config file:
macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
Windows: %APPDATA%/Claude/claude_desktop_config.json
First, start the HTTP server:
make run-http

Then add this to your Claude Desktop config:
{
"mcpServers": {
"deepl": {
"command": "npx",
"args": [
"mcp-remote",
"http://localhost:8000/mcp"
]
}
}
}

Benefits: Better performance, easier debugging, can be deployed remotely
{
"mcpServers": {
"deepl": {
"command": "uv",
"args": [
"--directory",
"/absolute/path/to/mcp-deepl",
"run",
"fastmcp",
"run",
"src/mcp_deepl/server.py"
]
}
}
}

- translate_text(text, target_lang, ...) - Translate text between languages
- translate_with_glossary(text, target_lang, glossary_id, ...) - Translate using custom glossary
- detect_language(text) - Detect the language of text
- list_languages(language_type) - List all supported languages
- get_usage() - Get API usage statistics
- list_glossaries() - List custom glossaries
- create_glossary(name, source_lang, target_lang, entries) - Create custom glossary
- get_glossary(glossary_id) - Get glossary details
- delete_glossary(glossary_id) - Delete a glossary
- translate_document(document_path, target_lang, ...) - Translate entire documents
- get_document_status(document_id, document_key) - Check translation status
- download_translated_document(document_id, document_key) - Download translated document
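As an illustration of how tools like these are exposed, here is a minimal FastMCP sketch. The server name, the direct use of the official client, and the simplified string return type are assumptions for brevity; see src/mcp_deepl/server.py for the real definitions.

```python
# Minimal FastMCP sketch of a tool like translate_text (illustrative only).
import asyncio
import os

import deepl
from fastmcp import FastMCP

mcp = FastMCP("deepl")
translator = deepl.Translator(os.environ["DEEPL_API_KEY"])


@mcp.tool()
async def translate_text(text: str, target_lang: str) -> str:
    """Translate text into the given target language (e.g. "DE")."""
    result = await asyncio.to_thread(
        translator.translate_text, text, target_lang=target_lang
    )
    return result.text


if __name__ == "__main__":
    mcp.run()  # stdio transport by default
```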
make help # Show all available commands
make install # Install dependencies
make dev-install # Install with dev dependencies
make format # Format code with ruff
make lint # Lint code with ruff
make typecheck # Type check with mypy
make test # Run tests with pytest
make test-cov # Run tests with coverage
make test-e2e # Run E2E Docker tests (requires Docker)
make test-http # Test HTTP server is running
make check # Run all checks (lint + typecheck + test)
make clean # Clean up artifacts
make all # Full workflow (clean + install + format + check)

# Run unit tests
make test
# Run with coverage report
make test-cov
# Run E2E Docker tests (requires Docker, not run in CI)
make test-e2e
# Run specific test file
uv run pytest tests/test_server.py -v

# Format code
make format
# Lint code
make lint
# Fix linting issues automatically
make lint-fix
# Type check
make typecheck
# Run all checks
make check

# Build local image
make docker-build
# Build and push multi-platform image
make release VERSION=1.0.0
# Run container
make docker-run

Multi-Platform Build Setup (first time only):
# Create and use a new buildx builder
docker buildx create --name multiplatform --use
# Verify the builder
docker buildx inspect --bootstrap

The release command builds for both linux/amd64 and linux/arm64 architectures and pushes directly to your container registry.
The server exposes a health check endpoint at /health:
curl http://localhost:8000/health
# {"status":"healthy","service":"mcp-deepl"}
# Or use the Makefile command
make test-http

If Claude Desktop can't connect to the server:
- Check server is running: make test-http
- Verify port: Ensure port 8000 is not in use by another service
- Check logs: Look at the server output for any errors
- Test MCP endpoint: curl http://localhost:8000/ should return MCP protocol info
- Verify .env: Ensure DEEPL_API_KEY is set in your .env file
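If you prefer to script these checks rather than use curl, a minimal Python probe against the /health endpoint documented above might look like this (port 8000 is the default from this README; adjust it if you changed the port):

```python
# Minimal probe for the /health endpoint, standard library only.
import json
import urllib.request

with urllib.request.urlopen("http://localhost:8000/health", timeout=5) as resp:
    payload = json.load(resp)

assert payload.get("status") == "healthy", f"Unexpected health payload: {payload}"
print("Server is up:", payload)
```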
To use a different port (e.g., 9000):
uv run uvicorn mcp_deepl.server:app --host 0.0.0.0 --port 9000

Then update your Claude Desktop config to use http://localhost:9000/mcp
Free Tier:
- 500,000 characters per month
- Suitable for testing and small projects
Pro Plans:
- Unlimited characters with usage-based pricing
- Higher priority processing
- Additional features
Monitor usage with the get_usage() tool.
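For a quick way to keep an eye on the free-tier quota, here is a sketch using the official deepl-python client that the server wraps; the Usage fields shown (character.count, character.limit) come from that library, and the warning logic is just an example.

```python
# Check remaining quota against the 500,000 character/month free-tier limit.
import os

import deepl

translator = deepl.Translator(os.environ["DEEPL_API_KEY"])
usage = translator.get_usage()

if usage.character.valid:
    used = usage.character.count
    limit = usage.character.limit
    print(f"Characters used: {used:,} / {limit:,} ({used / limit:.1%})")
if usage.any_limit_reached:
    print("Warning: a usage limit has been reached; translations will fail.")
```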
European Languages: Bulgarian (BG), Czech (CS), Danish (DA), German (DE), Greek (EL), English (EN), Spanish (ES), Estonian (ET), Finnish (FI), French (FR), Hungarian (HU), Italian (IT), Lithuanian (LT), Latvian (LV), Dutch (NL), Polish (PL), Portuguese (PT), Romanian (RO), Russian (RU), Slovak (SK), Slovenian (SL), Swedish (SV), Turkish (TR), Ukrainian (UK)
Asian Languages: Chinese (ZH), Indonesian (ID), Japanese (JA), Korean (KO)
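The server's list_languages tool presumably builds on calls like the ones below from the official deepl-python client; this sketch prints the full, current list at runtime, including which target languages support formality control.

```python
# Print the currently supported source and target languages.
import os

import deepl

translator = deepl.Translator(os.environ["DEEPL_API_KEY"])

print("Source languages:")
for lang in translator.get_source_languages():
    print(f"  {lang.code}: {lang.name}")

print("Target languages:")
for lang in translator.get_target_languages():
    # Target languages also expose whether formality control is supported.
    print(f"  {lang.code}: {lang.name} (formality: {lang.supports_formality})")
```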
# Translate text
result = await translate_text(
text="Hello, world!",
target_lang="DE"
)
# With formality control
result = await translate_text(
text="How are you?",
target_lang="DE",
formality="more" # Formal German
)

# Create glossary for consistent terminology
glossary = await create_glossary(
name="Product Terms",
source_lang="EN",
target_lang="DE",
entries={
"smartphone": "Smartphone",
"tablet": "Tablet-PC",
"app": "App"
}
)
# Translate using glossary
result = await translate_with_glossary(
text="Our new smartphone app",
target_lang="DE",
glossary_id=glossary.glossary_id
)

# Detect language
result = await detect_language(text="Bonjour le monde")
# Returns: {"detected_language": "FR", ...}

- Python 3.13+
- deepl (official DeepL Python client)
- fastapi
- fastmcp
- pydantic
- python-dotenv
- uvicorn
See CONTRIBUTING.md for development guidelines.
MIT
Part of the NimbleTools Registry - an open source collection of production-ready MCP servers. For enterprise deployment, check out NimbleBrain.