DeepL MCP Server
A Model Context Protocol (MCP) server that provides translation capabilities using the DeepL API, built with Python and FastMCP.
Working Demo
<video src="https://private-user-images.githubusercontent.com/3911298/452408725-04acb3c8-f37b-43a9-8b6f-249843a052ed.webm?jwt=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpc3MiOiJnaXRodWIuY29tIiwiYXVkIjoicmF3LmdpdGh1YnVzZXJjb250ZW50LmNvbSIsImtleSI6ImtleTUiLCJleHAiOjE3NDkyMzI2NzYsIm5iZiI6MTc0OTIzMjM3NiwicGF0aCI6Ii8zOTExMjk4LzQ1MjQwODcyNS0wNGFjYjNjOC1mMzdiLTQzYTktOGI2Zi0yNDk4NDNhMDUyZWQud2VibT9YLUFtei1BbGdvcml0aG09QVdTNC1ITUFDLVNIQTI1NiZYLUFtei1DcmVkZW50aWFsPUFLSUFWQ09EWUxTQTUzUFFLNFpBJTJGMjAyNTA2MDYlMkZ1cy1lYXN0LTElMkZzMyUyRmF3czRfcmVxdWVzdCZYLUFtei1EYXRlPTIwMjUwNjA2VDE3NTI1NlomWC1BbXotRXhwaXJlcz0zMDAmWC1BbXotU2lnbmF0dXJlPWM5NTJiMjhjMWVlODM0ZDVlMzMyNzgzNGE5NmRhZTI0YjQ5OGI5NzUzMWFkZTkxNzU0MDJkNDRmZWMwYTk1Y2ImWC1BbXotU2lnbmVkSGVhZGVycz1ob3N0In0.Kp9OyvzESVW_ml5tQhg1U5Fh_rFar78HDv0uXPaVAkU" controls width="100%"></video>
Features
- Translate text between numerous languages
- Rephrase text using DeepL's capabilities
- Access to all DeepL API languages and features
- Automatic language detection
- Formality control for supported languages
- Batch translation and document translation
- Usage and quota reporting
- Translation history and usage analysis
- Support for multiple MCP transports: stdio, SSE, and Streamable HTTP
Installation
Standard (Local) Installation

1. Clone the repository:

   ```bash
   git clone https://github.com/AlwaysSany/deepl-fastmcp-python-server.git
   cd deepl-fastmcp-python-server
   ```

2. Install uv (recommended) or use pip:

   With pip:

   ```bash
   pip install uv
   ```

   With pipx:

   ```bash
   pipx install uv
   ```

3. Install dependencies:

   ```bash
   uv sync
   ```

4. Set your environment variables:

   Create a `.env` file or export `DEEPL_AUTH_KEY` in your shell. You can do this by running the following command and then updating the `.env` file with your DeepL API key:

   ```bash
   cp .env.example .env
   ```

   Example `.env` file:

   ```
   DEEPL_AUTH_KEY=your_deepl_api_key
   ```

5. Run the server:

   Normal mode:

   ```bash
   uv run python main.py --transport stdio
   ```

   With Streamable HTTP transport (recommended for web deployments):

   ```bash
   uv run python main.py --transport streamable-http --host 127.0.0.1 --port 8000
   ```

   With SSE transport:

   ```bash
   uv run python main.py --transport sse --host 127.0.0.1 --port 8000
   ```

   Development mode:

   ```bash
   uv run mcp dev main.py
   ```
It will show messages in the terminal like this:

```
Spawned stdio transport
Connected MCP client to backing server transport
Created web app transport
Set up MCP proxy
🔍 MCP Inspector is up and running at http://127.0.0.1:6274
```
MCP Inspector,

Dockerized Installation
1. Build the Docker image:

   ```bash
   docker build -t deepl-fastmcp-server .
   ```

2. Run the container:

   ```bash
   docker run -e DEEPL_AUTH_KEY=your_deepl_api_key -p 8000:8000 deepl-fastmcp-server
   ```
Docker Compose
1. Create a `.env` file in the project root:

   ```
   DEEPL_AUTH_KEY=your_deepl_api_key
   ```

2. Start the service:

   ```bash
   docker compose up --build
   ```

   This builds the image and starts the server, mapping port 8000 on your host to the container.
Configuration
DeepL API Key
You'll need a DeepL API key to use this server. You can get one by signing up at DeepL API. With a DeepL API Free account you can translate up to 500,000 characters/month for free.
Required environment variables:
- `DEEPL_AUTH_KEY` (required): Your DeepL API key.
- `DEEPL_SERVER_URL` (optional): Override the DeepL API endpoint (default: `https://api-free.deepl.com`).
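As an illustration, reading these two variables can be sketched in a few lines of Python. The helper name `load_deepl_config` is hypothetical, not part of this server's API:

```python
import os

# Default endpoint for DeepL API Free accounts, per the README above.
DEFAULT_SERVER_URL = "https://api-free.deepl.com"

def load_deepl_config(env=None):
    # Hypothetical helper: read the two variables described above,
    # failing fast when the required key is missing.
    env = os.environ if env is None else env
    auth_key = env.get("DEEPL_AUTH_KEY")
    if not auth_key:
        raise RuntimeError("DEEPL_AUTH_KEY environment variable is required")
    return {
        "auth_key": auth_key,
        "server_url": env.get("DEEPL_SERVER_URL", DEFAULT_SERVER_URL),
    }
```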
MCP Transports
This server supports the following MCP transports:
- Stdio: Default transport for local usage.
- SSE (Server-Sent Events): Ideal for real-time event-based communication.
- Streamable HTTP: Suitable for HTTP-based streaming applications.
To configure these transports, ensure your environment supports the required protocols and dependencies.
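The transport flags used throughout this README follow a conventional `argparse` pattern; a minimal sketch (flag names are taken from the run commands above, while the defaults here are assumptions, not the server's documented behavior):

```python
import argparse

def build_parser():
    # Mirrors the CLI flags used in the run commands above;
    # defaults are illustrative assumptions.
    parser = argparse.ArgumentParser(description="DeepL FastMCP server")
    parser.add_argument("--transport",
                        choices=["stdio", "sse", "streamable-http"],
                        default="stdio")
    parser.add_argument("--host", default="127.0.0.1")
    parser.add_argument("--port", type=int, default=8000)
    return parser
```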
Usage
Use with Cursor IDE

Click File > Preferences > Cursor Settings > MCP > MCP Servers > Add new global MCP server and paste the following JSON:
```json
{
  "mcpServers": {
    "deepl-fastmcp": {
      "command": "uv",
      "args": [
        "--directory",
        "/path/to/your/deepl-fastmcp-python-server/.venv",
        "run",
        "--with",
        "mcp",
        "python",
        "/path/to/your/deepl-fastmcp-python-server/main.py",
        "--transport",
        "streamable-http",
        "--host",
        "127.0.0.1",
        "--port",
        "8000"
      ]
    }
  }
}
```
Note: The example above uses the Streamable HTTP transport. To use stdio instead, replace the `"--transport", "streamable-http", "--host", "127.0.0.1", "--port", "8000"` arguments with `"--transport", "stdio"`; for SSE, replace `"streamable-http"` with `"sse"`. Adjust the host and port as needed.
For example:

```json
"mcpServers": {
  "deepl-fastmcp": {
    "type": "sse",
    "url": "http://127.0.0.1:8000/sse"
  }
}
```

Then run the MCP server from a terminal:

```bash
uv run main.py --transport sse --host 127.0.0.1 --port 8000
```
Cursor Settings,

Use with Claude Desktop
This MCP server integrates with Claude Desktop to provide translation capabilities directly in your conversations with Claude.
Configuration Steps
1. Install Claude Desktop if you haven't already.

2. Create or edit the Claude Desktop configuration file:

   - On macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
   - On Windows: `%AppData%\Claude\claude_desktop_config.json`
   - On Linux: `~/.config/Claude/claude_desktop_config.json`
3. Add the DeepL MCP server configuration:
```json
{
  "mcpServers": {
    "deepl-fastmcp": {
      "command": "uv",
      "args": [
        "--directory",
        "/path/to/your/deepl-fastmcp-python-server/.venv",
        "run",
        "--with",
        "mcp",
        "python",
        "/path/to/your/deepl-fastmcp-python-server/main.py",
        "--transport",
        "streamable-http",
        "--host",
        "127.0.0.1",
        "--port",
        "8000"
      ]
    }
  }
}
```
Note: The example above uses the Streamable HTTP transport. To use stdio instead, replace the `"--transport", "streamable-http", "--host", "127.0.0.1", "--port", "8000"` arguments with `"--transport", "stdio"`; for SSE, replace `"streamable-http"` with `"sse"`. Adjust the host and port as needed.
Available Tools
This server provides the following tools:
- `translate_text`: Translate text to a target language
- `rephrase_text`: Rephrase text in the same or a different language
- `batch_translate`: Translate multiple texts in a single request
- `translate_document`: Translate a document file using the DeepL API
- `detect_language`: Detect the language of given text
- `get_translation_history`: Get recent translation operation history
- `analyze_usage_patterns`: Analyze translation usage patterns from history
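To illustrate the general shape of a name-to-function tool interface like this one (the real server registers tools with FastMCP's decorator; `TOOLS`, `tool`, and `call_tool` below are illustrative stand-ins, not this server's API):

```python
# Illustrative stand-in for decorator-based tool registration;
# the real server uses FastMCP, not this registry.
TOOLS = {}

def tool(name):
    def register(fn):
        TOOLS[name] = fn
        return fn
    return register

@tool("detect_language")
def detect_language(text: str) -> dict:
    # Placeholder result; the real tool queries the DeepL API.
    return {"text": text, "detected_language": "EN"}

def call_tool(name: str, **kwargs):
    if name not in TOOLS:
        raise KeyError(f"unknown tool: {name}")
    return TOOLS[name](**kwargs)
```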
Available Resources
The following resources are available for read-only data access (can be loaded into LLM context):
- `usage://deepl`: DeepL API usage info.
- `deepl://languages/source`: Supported source languages.
- `deepl://languages/target`: Supported target languages.
- `deepl://glossaries`: Supported glossary language pairs.
- `history://translations`: Recent translation operation history (same as the `get_translation_history` tool).
- `usage://patterns`: Usage pattern analysis (same as the `analyze_usage_patterns` tool).
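These resource URIs follow a simple `scheme://path` convention, which Python's standard `urllib.parse` splits cleanly; the helper below is illustrative, not part of the server:

```python
from urllib.parse import urlparse

def parse_resource_uri(uri: str):
    # Split a resource URI such as "deepl://languages/source"
    # into its scheme and the remaining path.
    parts = urlparse(uri)
    return parts.scheme, (parts.netloc + parts.path).strip("/")
```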
Available Prompts
The following prompt is available for LLMs:
- `summarize`: Returns a message instructing the LLM to summarize a given text.

  Example usage:

  ```python
  @mcp.prompt("summarize")
  def summarize_prompt(text: str) -> str:
      return f"Please summarize the following text:\n\n{text}"
  ```
Tool Details
<details>
<summary>🖼️ Click to see the tool details</summary>

translate_text
Translate text between languages using the DeepL API.
- Parameters:
  - `text`: The text to translate
  - `target_language`: Target language code (e.g., 'EN', 'DE', 'FR', 'ES', 'IT', 'JA', 'ZH')
  - `source_language` (optional): Source language code
  - `formality` (optional): Controls formality level ('less', 'more', 'default', 'prefer_less', 'prefer_more')
  - `preserve_formatting` (optional): Whether to preserve formatting
  - `split_sentences` (optional): How to split sentences
  - `tag_handling` (optional): How to handle tags
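A hedged sketch of how these parameters might be validated before a request is sent (`validate_translate_args` is a hypothetical helper, not part of this server; the formality values match the list above):

```python
# Formality values listed in the translate_text parameters above.
VALID_FORMALITY = {"less", "more", "default", "prefer_less", "prefer_more"}

def validate_translate_args(text, target_language, source_language=None, formality=None):
    # Hypothetical pre-flight checks mirroring the parameter list above.
    if not text:
        raise ValueError("text must be non-empty")
    if formality is not None and formality not in VALID_FORMALITY:
        raise ValueError(f"invalid formality: {formality!r}")
    request = {"text": text, "target_lang": target_language.upper()}
    if source_language:
        request["source_lang"] = source_language.upper()
    if formality:
        request["formality"] = formality
    return request
```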
rephrase_text
Rephrase text in the same or different language using the DeepL API.
- Parameters:
  - `text`: The text to rephrase
  - `target_language`: Language code for rephrasing
  - `formality` (optional): Desired formality level
  - `context` (optional): Additional context for better rephrasing
batch_translate
Translate multiple texts in a single request.
- Parameters:
  - `texts`: List of texts to translate
  - `target_language`: Target language code
  - `source_language` (optional): Source language code
  - `formality` (optional): Formality level
  - `preserve_formatting` (optional): Whether to preserve formatting
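Large inputs are commonly split into bounded batches before being sent; a minimal sketch (the 50-text ceiling reflects DeepL's documented per-request maximum, but confirm against the current API docs — `chunk_texts` itself is illustrative):

```python
def chunk_texts(texts, max_batch=50):
    # Yield successive slices no larger than max_batch;
    # DeepL's REST API accepts up to 50 text parameters per request.
    for start in range(0, len(texts), max_batch):
        yield texts[start:start + max_batch]
```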
translate_document
Translate a document file using DeepL API.
- Parameters:
  - `file_path`: Path to the document file
  - `target_language`: Target language code
  - `output_path` (optional): Output path for translated document
  - `formality` (optional): Formality level
  - `preserve_forma
