
<p align="center"> <img src="./assets/logo.png" /> </p> <p align="center"> <b>Transmart - Automate your i18n localization with AI.</b> </p>

English | 简体中文


Transmart is an open-source developer tool that uses ChatGPT to automate i18n translation. Given a base language and a list of target languages, running it generates all i18n locale files.

It consists of two parts: Cli and Core. Core is the Node.js core implementation of Transmart, while Cli is a command-line tool that wraps Core. In most cases, only Cli is used.

This project is currently under active development. PRs are welcome; reach me on Twitter.

Features

  • [x] Supports large files; no need to worry about the 4096-token limit
  • [x] Supports all languages that can be displayed using Intl.DisplayNames and can be processed by ChatGPT.
  • [x] Supports overriding AI-translated values
  • [x] Supports i18next
  • [ ] Supports vue-i18n
  • [x] Supports Chrome.i18n
  • [x] Supports Glob namespace matching
  • [x] Supports customizing the OpenAI model and API endpoint
  • [ ] Supports custom locale file structure
  • [ ] Supports iOS
  • [ ] Supports Android

Setup

Transmart requires Node.js version 13 or higher.

1. Installation

To install Transmart, run:

```shell
npm install @transmart/cli -D
# or
yarn add @transmart/cli
```

2. Project setup

First, create a transmart.config.js file in the root of your project, or any other file format that cosmiconfig can search for.

transmart.config.js

```javascript
module.exports = {
  baseLocale: 'en',
  locales: ['fr', 'jp', 'de'],
  localePath: 'public/locales',
  openAIApiKey: 'your-own-openai-api-key',
  overrides: {
    'zh-CN': {
      common: {
        create_app: 'Create my Application',
      },
    },
  },
}
```
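Assuming the i18next-style layout of one JSON file per locale and namespace under `localePath` (an assumption based on the i18next support listed above, not something this README spells out), a run with the config above would produce a tree roughly like:

```
public/locales/
├── en/            # base locale, written by you
│   └── common.json
├── fr/
│   └── common.json
├── jp/
│   └── common.json
└── de/
    └── common.json
```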

All Options Reference

3. Translate

Add the transmart command to your npm scripts:

```json
{
  "scripts": {
    "translate": "transmart"
  }
}
```

And then execute:

```shell
npm run translate
```

Or execute it directly with npx:

```shell
npx transmart
```

If you are not satisfied with the result of the AI translation, use the overrides option to partially overwrite the generated JSON.
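To make the overrides shape concrete, here is a minimal sketch of how a locale → namespace → key override replaces AI-generated values. The `applyOverrides` helper is NOT part of Transmart's API; it only illustrates the merge behavior described above.

```javascript
// Hypothetical AI-generated output for the zh-CN locale (namespace -> key -> value).
const aiGenerated = {
  common: {
    create_app: 'Create Application', // AI value we want to replace
    welcome: 'Welcome',
  },
}

// The `overrides` option shape from the config: locale -> namespace -> key -> value.
const overrides = {
  'zh-CN': {
    common: {
      create_app: 'Create my Application',
    },
  },
}

// Illustrative helper (not Transmart API): per-namespace merge where
// override keys win over generated keys.
function applyOverrides(generated, localeOverrides = {}) {
  const result = {}
  const namespaces = new Set([...Object.keys(generated), ...Object.keys(localeOverrides)])
  for (const ns of namespaces) {
    result[ns] = { ...generated[ns], ...localeOverrides[ns] }
  }
  return result
}

const merged = applyOverrides(aiGenerated, overrides['zh-CN'])
console.log(merged.common.create_app) // 'Create my Application' (overridden)
console.log(merged.common.welcome)    // 'Welcome' (untouched)
```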

🎉🎉 Enjoy i18n

Examples

Options

| Name | Type | Description | Required |
| ----------------------- | --------------------------------------------------- | ----------- | :------: |
| baseLocale | string | The language that Transmart will use as the translation reference. | Yes |
| locales | string[] | All languages that need to be translated. | Yes |
| localePath | string | Where you store your locale files. | Yes |
| openAIApiKey | string | The OpenAI API key. | Yes |
| context | string | Provide some context for a more accurate translation. | No |
| openAIApiModel | string | OpenAI API model; defaults to gpt-3.5-turbo-16k-0613. | No |
| overrides | Record<string, Record<string, Record<string, any>>> | Overwrites the generated JSON if you are not satisfied with the AI translation (locale → namespace → key: value). | No |
| namespaceGlob | string \| string[] | Glob for the namespace(s) to process; useful to include or exclude some files (see glob). | No |
| openAIApiUrl | string | Optional base URL of the OpenAI API; useful with a proxy. | No |
| openAIApiUrlPath | string | Optional URL endpoint path of the OpenAI API; useful with a proxy. | No |
| modelContextLimit | number | Optional max context window that the model supports, e.g. 32768 tokens for gpt-4-32k. Defaults to 4096 (gpt-3.5-turbo). | No |
| modelContextSplit | number | Optional ratio between input and output tokens. For example, if one English input token may produce two Spanish output tokens, set it to 1/2. Defaults to 1/1. | No |
| systemPromptTemplate | function | (For advanced usage) Custom prompt template. See translate.ts for the default prompt. | No |
| additionalReqBodyParams | any | (For advanced usage) Custom parameters passed in the request body; useful if you use a self-hosted model and want to customize model parameters (e.g. the llama.cpp server). | No |
| singleFileMode | boolean | Use a single file per locale for all namespaces. For example, a single en.json can be translated directly to zh.json. In this mode, the namespace is ignored and set to app. | No |
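As a concrete illustration, several of the optional settings above can be combined in one config. This is a sketch only: the model name, glob patterns, and paths are hypothetical example values, not recommendations from this README.

```javascript
// transmart.config.js -- example combining optional options from the table above.
module.exports = {
  baseLocale: 'en',
  locales: ['fr', 'de'],
  localePath: 'public/locales',
  // Read the key from the environment rather than hard-coding it.
  openAIApiKey: process.env.OPENAI_API_KEY,
  // Only process these namespaces; skip everything else.
  namespaceGlob: ['common', 'home-*'],
  // Domain context to steer translation terminology.
  context: 'UI strings for a developer-tools product.',
  // Use a larger-context model and declare its window size.
  openAIApiModel: 'gpt-4-32k',
  modelContextLimit: 32768,
  // Expect roughly one output token per input token.
  modelContextSplit: 1 / 1,
}
```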

Contributing

To contribute to Transmart, refer to contributing.md.

Inspired by

  • https://chatgpt-i18n.vercel.app/
  • https://twitter.com/forgebitz/status/1634100746617597955
  • https://github.com/yetone/openai-translator