Zuban
Python Type Checker / Language Server
Zuban is a high-performance Python language server and type checker implemented in Rust, by the author of Jedi. Zuban is 20–200× faster than Mypy, while using roughly half the memory and CPU of Ty and Pyrefly. It offers both a PyRight-like mode and a Mypy-compatible mode that behaves just like Mypy, supporting the same config files, command-line flags, and error messages.
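Because the Mypy-compatible mode reads standard Mypy configuration, an existing `mypy.ini` (or `[tool.mypy]` table in `pyproject.toml`) should work unchanged. A minimal sketch for illustration; the option names below are standard Mypy settings, not Zuban-specific:

```ini
[mypy]
python_version = 3.12
strict = True
ignore_missing_imports = True
```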
The most important LSP features are supported, including diagnostics, completions, go-to-definition, references, rename, hover, and document highlights.
Zuban passes over 95% of Mypy’s relevant test suite and offers comprehensive support for Python's type system.
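As an illustration of the kind of type-system feature a checker must support, consider structural subtyping via `typing.Protocol`. This file is a hypothetical example, not taken from the Zuban docs; running `zuban check` over code like this would verify that `Circle` satisfies the protocol without any explicit inheritance:

```python
from typing import Protocol


class SupportsArea(Protocol):
    """Structural type: anything with an area() -> float method."""

    def area(self) -> float: ...


class Circle:
    def __init__(self, radius: float) -> None:
        self.radius = radius

    def area(self) -> float:
        return 3.14159 * self.radius ** 2


def total_area(shapes: list[SupportsArea]) -> float:
    # Circle is accepted here structurally; a type checker verifies
    # the match even though Circle never names SupportsArea.
    return sum(s.area() for s in shapes)


print(round(total_area([Circle(1.0), Circle(2.0)]), 2))  # → 15.71
```

A deliberate mistake, such as passing a list of `str` to `total_area`, is the sort of error the checker reports while the code itself would only fail at runtime.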
Installation / Usage
```
pip install zuban   # installation
zuban check         # PyRight-like checking
zuban mypy          # Mypy compatibility mode
zmypy               # an alias for `zuban mypy`
zuban server        # an LSP server
```
If you want Zuban to pick up your project's dependencies, activate the virtual environment first.
Local Installation
You can install Zuban locally by running:

```
pip install maturin
git clone --recursive https://github.com/zubanls/zuban
cd zuban
bash scripts/install-locally.sh
```

Note that the build will not work properly if the submodules are not cloned.
License
This project is dual licensed:
- Open Source License: GNU Affero General Public License v3.0 (AGPL-3.0). You may use, modify, and distribute this project under the terms of the AGPL-3.0.
- Commercial License: Available for organizations that prefer not to comply with the AGPL. Contact info (at) zubanls.com for commercial licensing options.