Multranslate
A TUI for translating text in multiple translators simultaneously, as well as via OpenAI and local LLMs, with support for translation history and automatic language detection.
Cross-platform terminal user interface based on the Blessed library for simultaneous text translation using several popular translation sources and LLMs. None of the sources require an API access token (with the exception of official OpenAI or OpenRouter). Supports automatic detection of the source and target languages (between English and any of the supported languages), as well as access to the translation history via SQLite (up to 500 requests, after which the oldest records are automatically cleared).

Translation providers
- Google - free and unlimited API using serverless hosted on the Vercel platform. Can translate more than 5000 characters.
- DeepL - free API via DeepLX using serverless hosted on the Vercel platform. Frequent translation requests may be rate-limited, and there may also be a character limit (the official limit is 5000 characters per request).
- Reverso - the most stable, free, and without any limit on the number of characters (the website version is limited to 2000 characters and the application to 900; via the API up to 8000 can be obtained). There is no official documentation; the request format was captured from the official site via DevTools.
- MyMemory - free and open API (limit of 500 characters per request). Supports up to 3 response options for short queries.
- LLM - large language models with a pre-installed system prompt for text translation, or in chat mode with support for streaming responses.
  - OpenAI - the official provider of the ChatGPT models. To use it, pass the API key via the `--key` parameter (which has higher priority) or the `OPENAI_API_KEY` environment variable (similarly for OpenRouter).
  - OpenRouter - a universal provider with unified access to different models. Supports free models (e.g. DeepSeek R1), which allows you to use it without topping up your account immediately after registration. To use it, pass the URL and API key via parameters or environment variables, similar to OpenAI.
  - LM Studio - an interface for running and using local models offline (the API request and response scheme corresponds to OpenAI's). It is recommended to choose a model pre-trained for the desired language (for example, using the `translation` filter on Hugging Face).
Install
Use the npm package manager to install the stable version:
npm install -g multranslate
Or install from the GitHub repository:
npm install -g https://github.com/Lifailon/multranslate
Run the application:
multranslate
Get help:
multranslate --help
Usage: multranslate [options]
Cross-platform TUI for translating text in multiple translators simultaneously and LLM, with support for
translation history and automatic language detection.
Options:
-V, --version output the version number
-l, --language <name> Select the language: ru, ja, zh, ko, ar, tr, uk, sk, pl, de, fr, it, es, el, hu, nl,
sv, ro, cs, da, pt, vi (default: "ru" or the environment "TRANSLATE_LANGUAGE")
-t, --translator <name> Select the translator: all, Google, DeepL, Reverso, MyMemory, OpenAI (default: "all")
-k, --key <value> API key parameter for OpenAI (high priority) or using the environment "OPENAI_API_KEY"
-u, --urlOpenai <url> Url address for OpenAI, OpenRouter or local LLM API (default: "https://api.openai.com"
or the environment "OPENAI_URL")
-m, --model <name> Select the LLM model (default: "gpt-4o-mini" or the environment "OPENAI_MODEL")
-e, --temp <number> Select the temperature for LLM (default: "0.7" or the environment "OPENAI_TEMP")
-h, --help display help for command
To use OpenAI, pass the connection parameters on the command line (these take priority) or use environment variables (recommended).
OpenAI
Using environment variables in Linux:
export OPENAI_API_KEY="sk-proj-..."
multranslate
To keep the environment variable available across terminal sessions, append it to your shell profile:
echo 'export OPENAI_API_KEY="sk-proj-..."' >> ~/.bashrc
source ~/.bashrc
multranslate
It is recommended to edit the profile file with a text editor (for example, nano) so that the key is not saved in the command history.
OpenRouter
Using the DeepSeek R1 free model via parameters:
multranslate -u "https://openrouter.ai/api" -m "deepseek/deepseek-r1:free" -k "sk-or-v1-..."
Note that the default /v1/chat/completions append path is used for all requests.
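How the final endpoint is assembled can be illustrated with a short sketch (the base URL below is the OpenRouter address from the example above):

```shell
# The value passed via --urlOpenai (or OPENAI_URL) is used as the base URL;
# the /v1/chat/completions path is appended to it for every request.
BASE_URL="https://openrouter.ai/api"
echo "${BASE_URL}/v1/chat/completions"
# → https://openrouter.ai/api/v1/chat/completions
```

The same composition applies to the official OpenAI address and to a local LM Studio endpoint.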
LM Studio
Using environment variables in Windows:
$env:OPENAI_URL = "http://127.0.0.1:1234"
$env:OPENAI_MODEL = "llama-3-8b-gpt-4o-ru1.0"
multranslate
Save variables in the current user's environment via PowerShell for later use:
[System.Environment]::SetEnvironmentVariable("OPENAI_API_KEY", "sk-or-v1-...", "User")
[System.Environment]::SetEnvironmentVariable("OPENAI_URL", "https://openrouter.ai/api", "User")
[System.Environment]::SetEnvironmentVariable("OPENAI_MODEL", "deepseek/deepseek-r1:free", "User")
To apply, restart the terminal.
Build
Clone the repository:
git clone https://github.com/Lifailon/multranslate
cd multranslate
Install dependencies and run the application:
npm install
npm start
Docker
Create a `.env` file with environment variables for language selection and connecting to the LLM.
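A minimal `.env` sketch, assuming the same variable names as in the options above (all values are placeholders; adjust them for your setup):

```shell
# Hypothetical .env for the container - variable names taken from --help
TRANSLATE_LANGUAGE=de
OPENAI_URL=http://127.0.0.1:1234
OPENAI_MODEL=llama-3-8b-gpt-4o-ru1.0
OPENAI_API_KEY=sk-proj-...
```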
Build the image and run a temporary container (volume is used to store history between runs):
docker build -t multranslate .
docker run --env-file .env -it --rm -v multranslate:/multranslate multranslate
Supported languages
You can change the language used for automatic detection between English and any of the languages in the table below:
| Parameter | Language |
| - | - |
| ru | Russian (default) |
| ja | Japanese |
| zh | Chinese |
| ko | Korean |
| ar | Arabic |
| tr | Turkish |
| uk | Ukrainian |
| sk | Slovak |
| pl | Polish |
| de | German |
| fr | French |
| it | Italian |
| es | Spanish |
| el | Greek |
| hu | Hungarian |
| nl | Dutch |
| sv | Swedish |
| ro | Romanian |
| cs | Czech |
| da | Danish |
| pt | Portuguese (#1) |
| vi | Vietnamese (#2) |
Each character of the input is analyzed and matched against either the English alphabet or the alphabet of the language specified in the `--language` parameter.
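The direction choice can be sketched roughly as follows (a simplified illustration, not the actual implementation: it only tests for characters outside the printable ASCII range rather than matching a specific alphabet):

```shell
# Hedged sketch: pick the translation direction by testing whether the
# input contains letters outside the printable ASCII range.
TEXT="Привет"
if printf '%s' "$TEXT" | grep -q '[^ -~]'; then
  echo "target language detected -> translate to English"
else
  echo "English detected -> translate to the --language target"
fi
```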
You can also use any of the translators individually by specifying the appropriate option at startup:
<table> <tr> <td><code>multranslate --translator Google --language tr</code> </td> <td><code>multranslate --translator DeepL --language de</code> </td> </tr> <tr> <td><img src=/image/google-tr.jpg width=600/></td> <td><img src=/image/deepl-de.jpg width=600/></td> </tr> <tr> <td><code>multranslate --translator Reverso --language it</code> </td> <td><code>multranslate --translator MyMemory --language es</code> </td> </tr> <tr> <td><img src=/image/reverso-it.jpg width=600/></td> <td><img src=/image/mymemory-es.jpg width=600/></td> </tr> </table>

Hotkeys
- `Ctrl+<Enter/S>` - translate the text without inserting a line break.
- `Ctrl+V` - paste text from the clipboard (handled at code level).
- `Alt+C` - copy text from the input field to the clipboard.
- `Alt+<1/2/3/4/5>` - copy a translation result from the output windows to the clipboard (the key combination for each translator is shown in brackets); the selected panel changes its color to green.
- `Ctrl+<N/Z>` - move to the previous entry in the translation history.
- `Ctrl+<P/X>` - move to the next entry in the translation history.
