Blz
Local-first, line-accurate search for blazing-fast lookups of llms.txt documentation. Human-friendly, Agent-ready.
BLZ ※
blaze /bleɪz/ (verb, noun)
- verb – Move or proceed at high speed; achieve something rapidly
- noun – A trail marker, typically painted on trees with specific colors and patterns; a mark to guide explorers on their journey
- abbr. – BLZ – A local-first search tool that indexes llms.txt documentation for instant, line-accurate retrieval
What is BLZ?
A Rust + Tantivy-based CLI tool that downloads, parses, and indexes llms.txt files locally to enable fast documentation search with line-accurate retrieval.
Quick Start
# Install (one line)
curl -fsSL https://blz.run/install.sh | sh
# Add Bun's docs
blz add bun https://bun.sh/llms.txt
# Search (results in 6ms)
blz find "test runner"
# Or just: blz "test runner"
# Browse documentation structure
blz toc bun --tree -H 1-2
# Pull exact lines (citations auto-detected)
blz find bun:304-324 --json
# Or: blz bun:304-324 --json
MCP Server Setup
Enable BLZ in your AI coding assistant with one command:
# Claude Code
claude mcp add blz blz mcp-server --scope user
# Cursor, Windsurf, and others
# See detailed setup: docs/mcp/SETUP.md
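For clients configured through a JSON file (Cursor, Windsurf, and similar), an entry along these lines is typical. The exact file location and top-level key vary by client, so treat this as a hedged sketch rather than an exact recipe:

```json
{
  "mcpServers": {
    "blz": {
      "command": "blz",
      "args": ["mcp-server"]
    }
  }
}
```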
What you'll see:
✓ Added bun (1,926 headings, 43,150 lines) in 890ms
Search results for 'test runner' (6ms):
1. bun:304-324 (score: 92%)
📍 Bun Documentation > Guides > Test runner
### Test runner
Bun includes a fast built-in test runner...
Docs
- Documentation index – Overview of every guide, reference, and technical deep dive.
- Quickstart guide – Install BLZ and run your first searches in minutes.
- Agent playbook – Best practices for using BLZ inside AI workflows.
- Architecture overview – Core components, storage layout, and performance notes.
What's llms.txt?
llms.txt is a simple Markdown standard for making documentation accessible to AI agents. llms-full.txt is an expanded version that includes all documentation for a project.
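For orientation, a minimal llms.txt per the llmstxt.org convention is plain Markdown: an H1 title, a blockquote summary, and H2 sections of links. The entries below are illustrative, not taken from a real file:

```markdown
# Bun

> Bun is a fast all-in-one JavaScript runtime.

## Docs

- [Test runner](https://bun.sh/docs/cli/test): Built-in Jest-compatible test runner
```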
Why they're great:
- Comprehensive documentation that's kept up to date
- Single file in a standardized format makes for easy retrieval and indexing
The challenge:
- They're huge (12K+ lines, 200K+ tokens)
- Too context-heavy for agents to use directly
- Keeping them up to date is manual work
Why BLZ?
BLZ indexes llms.txt documentation files locally:
- 6ms search across locally saved docs (vs. seconds for web requests)
- Exact line citations (e.g., bun:304-324) for copy-paste accuracy
- Works offline after initial download
- Smart updates with HTTP caching (only fetches when changed)
The Problem
Projects publish complete docs as llms-full.txt files, but:
- They're massive (12K+ lines, 200K+ tokens)
- Too context-heavy for agents to use directly
But what about MCP servers for searching docs?
- They're great, and we use them too! But...
- Results can take up a lot of an agent's context window
- May require multiple searches to find critical info
BLZ's Solution
Cache & index llms.txt locally → search in ms → retrieve only needed lines
With BLZ, agents get the docs they need in a fraction of the time, using a fraction of the context.
See docs/architecture/PERFORMANCE.md for detailed benchmarks and methodology.
Features
- Unified find command: Smart pattern detection automatically routes queries to search or retrieval
- Heading level filtering: Search within specific heading depths with the -H flag (e.g., -H <=2 for top-level sections)
- One-line installation: Install script with SHA-256 verification and platform detection
- Fast search: 6ms typical search latency with exact line citations
- Offline-first: Works offline after initial download, smart updates with HTTP caching
- Clipboard support: Copy search results directly with the --copy flag
- Source insights: Commands for visibility (blz stats, blz info, blz history)
- Enhanced TOC Navigation:
  - Heading level filtering (-H 1, -H 2-3, -H <=2, -H >3)
  - Tree view visualization with box-drawing characters (--tree)
  - Multi-source TOC browsing (--source bun,react or --all)
  - Backward compatible with --max-depth
- Direct CLI integration: IDE agents run commands directly for instant JSON results
- MCP server: stdio-based integration via official Rust SDK for deep AI assistant integration
Language Filtering
BLZ automatically filters non-English content from multilingual documentation sources:
- Enabled by default: Reduces storage by 60-90% for multilingual sources
- Opt-out available: Use --no-language-filter to keep all languages
- Retroactive: Use blz refresh <source> --reindex --filter on existing sources
# Add source with filtering (default)
blz add anthropic https://docs.anthropic.com/llms-full.txt
# Add without filtering
blz add anthropic https://docs.anthropic.com/llms-full.txt --no-language-filter
# Fix existing source
blz refresh anthropic --reindex --filter
See Language Filtering Migration Guide for details.
Installation
Quick Install (macOS/Linux)
curl -fsSL https://blz.run/install.sh | sh
This installs the latest release to ~/.local/bin. Override the target location with BLZ_INSTALL_DIR=/path, or pin a version via BLZ_VERSION=v0.4.1. Run sh install.sh --help for additional options (e.g., --dir, --version, --dry-run).
From Source
# Clone and install
git clone https://github.com/outfitter-dev/blz
cd blz
cargo install --path crates/blz-cli
# Or install directly from GitHub
cargo install --git https://github.com/outfitter-dev/blz --branch main blz-cli
# Optional dev build (installs `blz-dev` only)
./install-dev.sh --root "$HOME/.local/share/blz-dev"
# See docs/development/README.md for full local workflow guidance.
Shell Setup
Fish
# Add to PATH
set -gx PATH $HOME/.cargo/bin $PATH
# Install completions
blz completions fish > ~/.config/fish/completions/blz.fish
Bash/Zsh
# Add to PATH
export PATH="$HOME/.cargo/bin:$PATH"
# Install completions (Bash)
blz completions bash > ~/.local/share/bash-completion/completions/blz
# Install completions (Zsh)
blz completions zsh > ~/.zsh/completions/_blz
# Install completions (Elvish)
blz completions elvish > ~/.local/share/elvish/lib/blz.elv
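The four completion commands differ only in their output path. As an illustrative helper (the function name is made up for this sketch; the paths mirror the commands above), the mapping can be centralized:

```shell
# Hypothetical helper: map a shell name to the completion file path used above.
blz_completions_path() {
  case "$1" in
    fish)   echo "$HOME/.config/fish/completions/blz.fish" ;;
    bash)   echo "$HOME/.local/share/bash-completion/completions/blz" ;;
    zsh)    echo "$HOME/.zsh/completions/_blz" ;;
    elvish) echo "$HOME/.local/share/elvish/lib/blz.elv" ;;
    *)      return 1 ;;
  esac
}

# Usage: blz completions fish > "$(blz_completions_path fish)"
blz_completions_path fish
```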
Documentation
Comprehensive documentation is available in the docs/ directory:
Getting Started
- Quick Start - Installation and first steps
- CLI Overview - Installation, flags, and binaries
- How-To Guide - Task-oriented "I want to…" solutions
CLI Reference
- Command Reference - Complete command catalog
- Search Guide - Search syntax and advanced patterns
- Managing Sources - Adding and organizing documentation
- Configuration - Global, per-source, and env settings
- Shell Integration - Completions for Bash, Zsh, Fish, PowerShell, Elvish
Technical Details
- Storage Layout - Directory structure and disk management
- Architecture - System design and performance
- Performance - Benchmarks and optimization
Usage For AI Agents
- Quick primer: Run blz --prompt in your terminal
- Programmatic CLI docs: blz docs export --json (legacy: blz docs --format json)
- Detailed instructions: See docs/agents/use-blz.md (copy into CLAUDE.md or AGENTS.md)
Typical Agent Flow
# Get caught up with blz's features and capabilities
blz --prompt
# List available sources
blz list --status --json
# Add sources non-interactively
blz add bun https://bun.sh/llms.txt -y
# Search Bun docs and capture the first alias:lines citation
span=$(blz "test runner" --json | jq -r '.results[0] | "\(.alias):\(.lines)"')
# Retrieve the exact lines with 5 lines of context on either side
blz get "$span" -C 5 --json
# Need more than one range? Comma-separate them after the alias
blz get bun:41994-42009,42010-42020 --json
# Want the full heading section? Expand with --context all (and cap the output)
blz get bun:41994-42009 --context all --max-lines 80 --json
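The alias:lines spans above can also be taken apart without jq. This is a generic POSIX parameter-expansion sketch, not a blz command; the variable names are illustrative:

```shell
# Split a blz citation like "bun:41994-42009" into its parts.
span="bun:41994-42009"

alias_part="${span%%:*}"        # text before the first ":"  -> bun
range_part="${span#*:}"         # text after the first ":"   -> 41994-42009
start_line="${range_part%%-*}"  # -> 41994
end_line="${range_part##*-}"    # -> 42009

echo "$alias_part $start_line $end_line"
```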
IDE Agent Integration
Direct CLI Usage (Recommended)
IDE agents can run blz commands directly for millisecond responses:
# Search for documentation
blz "test runner" -s bun --json
# Get exact line ranges
blz get bun:423-445
# Merge multiple spans for the same source (comma-separated)
blz get bun:41994-42009,42010-42020 --json
# Expand to the entire heading block when the agent needs full prose
blz get bun:41994-42009 --context all --max-lines 80 --json
# List all indexed sources (note: list returns array; search returns object with .results)
blz list --json | jq 'length'
The JSON output is designed for easy parsing by agents:
{
"alias": "bun",
"file": "llms.txt",
"headingPath": ["CLI", "Flags"],
"lines": "311-339",
"snippet": "--concurrency<N> ...",
"score": 12.47,
"sourceUrl": "https://bun.sh/llms.txt#L311-L339",
"checksum": "sha256:..."
}
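The alias and lines fields combine into the alias:lines citations that blz get accepts. As a sketch of that parsing step, using jq as in the examples above (the result object is inlined here; a real agent would read blz output instead):

```shell
# Turn a blz result object into an "alias:lines" citation with jq.
result='{"alias":"bun","lines":"311-339","headingPath":["CLI","Flags"],"score":12.47}'

citation=$(printf '%s' "$result" | jq -r '"\(.alias):\(.lines)"')
echo "$citation"   # bun:311-339
```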
MCP Server
BLZ provides a Model Context Protocol server for deep integration with AI coding assistants.
Launch the server:
blz mcp-server
The MCP server exposes:
- find tool: Search and retrieve documentation with exact line citations
- list-sources tool: Discover installed and registry sources
- source-add tool: Add documentation sources
- Resources: Browse source metadata via blz://sources/{alias} URIs
- Prompts: Guided workflows like discover-docs
Quick example:
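MCP servers communicate over stdio using JSON-RPC. As a hedged sketch (the find tool's argument names here are an assumption, and real clients perform an initialize handshake first), a tools/call message might look like this; it is only printed here, where a client would write it to blz mcp-server's stdin:

```shell
# Illustrative MCP tools/call request for the "find" tool.
# Real clients exchange initialize messages before calling tools.
request='{"jsonrpc":"2.0","id":1,"method":"tools/call","params":{"name":"find","arguments":{"query":"test runner"}}}'

printf '%s\n' "$request"
# A client would pipe this (after initialization) into: blz mcp-server
```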