# Bernstein
Declarative agent orchestration for engineering teams.
One YAML. Multiple coding agents. Ship while you sleep.
<picture> <source media="(prefers-color-scheme: dark)" srcset="docs/assets/tui.svg"> <source media="(prefers-color-scheme: light)" srcset="docs/assets/tui.svg"> <img alt="Bernstein TUI — live task dashboard" src="docs/assets/tui.svg" width="700"> </picture>

<p align="center"><strong>Web dashboard</strong> — real-time task monitoring, cost tracking, agent status</p>
<p align="center"><img alt="Bernstein Web Dashboard" src="docs/assets/web-dashboard.png" width="700" style="border-radius:8px"></p>

Homepage | Documentation | Getting Started | Known Limitations
</div>

If you're running one agent at a time, you're leaving performance on the table. Bernstein takes a goal, breaks it into tasks, assigns them to AI coding agents running in parallel, verifies the output, and commits the results. You come back to working code, passing tests, and a clean git history.
No framework to learn. No vendor lock-in. Works with Claude Code, Codex, Gemini CLI, Cursor, Aider, Amp, Roo Code, Goose, Qwen, and any CLI tool that accepts a prompt flag.
<picture> <source media="(prefers-color-scheme: dark)" srcset="docs/assets/architecture.svg"> <source media="(prefers-color-scheme: light)" srcset="docs/assets/architecture.svg"> <img alt="Architecture" src="docs/assets/architecture.svg" width="650"> </picture>

Think of it as what Kubernetes did for containers, but for AI coding agents. You declare a goal. The control plane decomposes it into tasks. Short-lived agents execute them in isolated git worktrees -- like pods. A janitor verifies the output before anything lands.
```shell
pip install bernstein         # any platform
# or
pipx install bernstein        # isolated install
# or
uv tool install bernstein     # fastest (Rust-based)
# or
brew tap chernistry/bernstein && brew install bernstein  # macOS / Linux
# or
sudo dnf copr enable alexchernysh/bernstein && sudo dnf install bernstein  # Fedora / RHEL
# or
npx bernstein-orchestrator    # npm wrapper (requires Python 3.12+)

# Run:
bernstein -g "Add JWT auth with refresh tokens, tests, and API docs"
```
1.78× faster than single-agent execution, verified on internal benchmarks. See benchmarks for methodology and reproduction steps.
## What it is
Bernstein is a deterministic orchestrator for CLI coding agents. It schedules tasks in parallel across any installed agent — Claude Code, Codex, Cursor, Gemini, Aider, and more — with git worktree isolation, janitor-verified output, and file-based state you can inspect, back up, and recover from. No vendor lock-in. No framework to learn. Your agents, your models, your backlog.
## 5-minute setup
```shell
# 1. Install (pick one — full list in the install block above)
pipx install bernstein

# 2. Init your project (creates .sdd/ workspace + bernstein.yaml)
cd your-project
bernstein init

# 3. Run — pass a goal inline or let bernstein.yaml guide the run
bernstein -g "Add rate limiting and improve test coverage"
```
That's it. Your agents spawn, work in parallel, verify their output, and exit. Watch progress in the terminal dashboard.
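What `bernstein init` actually writes into `bernstein.yaml` is documented in the project docs; the sketch below is illustrative only. Every key name here (`goal`, `agents`, `adapter`, `verify`) is an assumption chosen to show the shape of a declarative run, not the shipped schema:

```yaml
# Hypothetical bernstein.yaml -- field names are illustrative, not the shipped schema.
goal: "Add rate limiting and improve test coverage"

agents:
  - name: backend          # role label (assumption)
    adapter: claude-code   # one of the shipped adapters
  - name: qa
    adapter: aider

verify:                    # janitor signals described under "Shipped features"
  - tests_pass
  - no_regressions
```

The point of the shape: the file declares *what* should happen, and the orchestrator decides which agent runs which task in which worktree.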
## Supported agents
Bernstein ships with adapters for 12 CLI agents. If you have any of these installed, Bernstein uses them — no API key plumbing required:
| Agent | Models | Install |
|-------|--------|---------|
| Aider | Any OpenAI/Anthropic-compatible model | pip install aider-chat |
| Amp | opus 4.6, gpt-5.4 | brew install amp |
| Claude Code | opus 4.6, sonnet 4.6, haiku 4.5 | npm install -g @anthropic-ai/claude-code |
| Codex CLI | gpt-5.4, o3, o4-mini | npm install -g @openai/codex |
| Cursor | sonnet 4.6, opus 4.6, gpt-5.4 | Cursor app (sign in via app) |
| Gemini CLI | gemini-3-pro, 3-flash | npm install -g @google/gemini-cli |
| Goose | Any provider | Install Goose CLI |
| Kilo | Configurable | npm install -g kilo |
| Kiro | Multi-provider | Install Kiro CLI |
| OpenCode | Multi-provider | Install OpenCode CLI |
| Qwen | qwen3-coder, qwen-max | npm install -g qwen-code |
| Roo Code | opus 4.6, sonnet 4.6, gpt-4o | VS Code extension (headless CLI) |
Prefer a different agent? Bring your own -- the generic adapter accepts any CLI tool that takes its prompt via a flag. Mix models in the same run: cheap free-tier agents for boilerplate, heavy models for architecture.
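How a bring-your-own agent is declared isn't specified in this README, so the entry below is a hedged sketch: `command` and `prompt_flag` are hypothetical key names invented for illustration, and `my-coding-cli` is a placeholder binary.

```yaml
# Hypothetical generic-adapter entry -- key names are assumptions, not the shipped schema.
agents:
  - name: my-tool
    adapter: generic
    command: my-coding-cli   # any CLI tool on PATH
    prompt_flag: --prompt    # the flag that receives the task prompt
```

The design intent, per the README, is that anything speaking "prompt in, diff out" can join a run without a bespoke adapter.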
> [!TIP]
> Run `bernstein --headless` for CI pipelines -- no TUI, structured JSON output, non-zero exit on failure.
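In a CI context that might look like the following GitHub Actions step. Only `bernstein --headless`, the `-g` goal flag, and the non-zero-exit-on-failure behavior come from this README; the workflow wiring around them is illustrative.

```yaml
# Illustrative GitHub Actions step -- workflow wiring is an assumption.
- name: Run Bernstein headless
  run: |
    pipx install bernstein
    bernstein --headless -g "Fix failing lint and type-check errors"
  # Non-zero exit on failure makes the job fail automatically;
  # structured JSON on stdout can be captured for later steps.
```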
## Shipped features
Only capabilities that ship with v1.4.11. Full matrix at FEATURE_MATRIX.md.
- Deterministic scheduling — zero LLM tokens on coordination. The orchestrator is plain Python.
- Parallel execution — spawn multiple agents across roles (backend, qa, docs, security) simultaneously.
- Git worktree isolation — every agent works in its own branch. Your main branch stays clean.
- Janitor verification — concrete signals (tests pass, files exist, no regressions) before anything lands.
- Quality gates — lint, type-check, PII scan, and mutation testing run automatically after completion.
- Plan files — multi-stage YAML with stages and steps, like Ansible playbooks (`bernstein run plan.yaml`).
- Cost tracking — per-model spend, tokens, and duration (`bernstein cost`).
- Live dashboards — terminal TUI (`bernstein live`) and browser UI (`bernstein dashboard`).
- Self-evolution — analyze metrics, propose improvements, sandbox-test, and auto-apply what passes (`--evolve`).
- CI autofix — parse failing CI logs, create fix tasks, route to the right agent (`bernstein ci fix <url>`).
- Circuit breaker — halt agents that repeatedly violate purpose or crash.
- Token growth monitor — detect runaway token consumption and intervene automatically.
- Cross-model verification — route completed task diffs to a different model for review.
- Audit trail — HMAC-chained tamper-evident logs with Merkle seal verification.
- Pluggy plugin system — hook into any lifecycle event.
- Multi-repo workspaces — orchestrate across multiple git repositories as one workspace.
- Cluster mode — central server + remote worker nodes for distributed execution.
- MCP server mode — run Bernstein as an MCP tool server for other agents.
- 12 agent adapters — Claude, Codex, Cursor, Gemini, Aider, Amp, Roo Code, Kiro, Kilo, OpenCode, Qwen, Goose, plus a generic catch-all.
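The README describes plan files as multi-stage YAML with stages and steps, run via `bernstein run plan.yaml`. A minimal sketch of that shape, with every field name (`stages`, `steps`, `goal`, `role`) assumed rather than taken from the shipped schema:

```yaml
# Hypothetical plan.yaml -- stage/step field names are assumptions.
stages:
  - name: implement
    steps:
      - goal: "Add rate-limit middleware"
        role: backend
      - goal: "Write integration tests"
        role: qa
  - name: document
    steps:
      - goal: "Update API docs"
        role: docs
```

The Ansible-playbook analogy from the feature list is the useful mental model: stages run in order, and steps within a stage are candidates for parallel agents.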
## Install
All methods install the same bernstein CLI.
| Method | Command |
|--------|---------|
| pip | pip install bernstein |
| pipx | pipx install bernstein |
| uv | uv tool install bernstein |
| Homebrew | brew tap chernistry/bernstein && brew install bernstein |
| Fedora / RHEL | sudo dnf copr enable alexchernysh/bernstein && sudo dnf install bernstein |
| npm (thin wrapper) | npx bernstein-orchestrator or npm i -g bernstein-orchestrator |
The npm wrapper requires Python 3.12+ on the system -- it delegates to pipx/uvx/python under the hood.
COPR targets: Fedora 41, 42 (x86_64, aarch64), EPEL 9, 10.
## Editor extensions

| Editor | Install |
|--------|---------|
