# Gnosis MCP

Zero-config MCP server for searchable documentation (SQLite by default, PostgreSQL optional).

<a href="#quick-start"><img src="https://raw.githubusercontent.com/nicholasglazer/gnosis-mcp/main/demo-hero.gif" alt="Gnosis MCP — ingest docs, search, view stats, serve" width="700"></a>
<br>
<sub>Ingest docs → Search with highlights → Stats overview → Serve to AI agents</sub>

**Without a docs server**
- LLMs hallucinate API signatures that don't exist
- Entire files dumped into context — 3,000 to 8,000+ tokens each
- Architecture decisions buried across dozens of files

**With Gnosis MCP**

- `search_docs` returns ranked, highlighted excerpts (~600 tokens)
- Real answers grounded in your actual documentation
- Works across hundreds of docs instantly
## Features

- Zero config — SQLite by default, `pip install` and go
- Hybrid search — keyword (BM25) + semantic (local ONNX embeddings, no API key)
- Git history — ingest commit messages as searchable context (`ingest-git`)
- Web crawl — ingest documentation from any website via sitemap or link crawl
- Multi-format — `.md` `.txt` `.ipynb` `.toml` `.csv` `.json` + optional `.rst` `.pdf`
- Auto-linking — `relates_to` frontmatter creates a navigable document graph
- Watch mode — auto re-ingest on file changes
- PostgreSQL ready — pgvector + tsvector when you need scale
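The auto-linking feature reads `relates_to` from a doc's frontmatter. A minimal sketch of what that looks like (the file names and keys below are illustrative, not taken from the Gnosis MCP docs):

```markdown
---
title: Authentication
category: guides
relates_to:
  - guides/sessions.md
  - reference/api-keys.md
---

# Authentication

How login and token refresh work...
```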
## Quick Start

```sh
pip install gnosis-mcp
gnosis-mcp ingest ./docs/   # loads docs into SQLite (auto-created)
gnosis-mcp serve            # starts MCP server
```
That's it. Your AI agent can now search your docs.
Want semantic search? Add local embeddings — no API key needed:
```sh
pip install gnosis-mcp[embeddings]
gnosis-mcp ingest ./docs/ --embed   # ingest + embed in one step
gnosis-mcp serve                    # hybrid search auto-activated
```
Test it before connecting to an editor:
```sh
gnosis-mcp search "getting started"              # keyword search
gnosis-mcp search "how does auth work" --embed   # hybrid semantic+keyword
gnosis-mcp stats                                 # see what was indexed
```
<details>
<summary>Try without installing (uvx)</summary>

```sh
uvx gnosis-mcp ingest ./docs/
uvx gnosis-mcp serve
```

</details>
## Web Crawl

<div align="center">
<img src="https://raw.githubusercontent.com/nicholasglazer/gnosis-mcp/main/demo-crawl.gif" alt="Gnosis MCP — crawl docs with dry-run, fetch, search, SSRF protection" width="700">
<br>
<sub>Dry-run discovery → Crawl & ingest → Search crawled docs → SSRF protection</sub>
</div>

Ingest docs from any website — no local files needed:
```sh
pip install gnosis-mcp[web]

# Crawl via sitemap (best for large doc sites)
gnosis-mcp crawl https://docs.stripe.com/ --sitemap

# Depth-limited link crawl with URL filter
gnosis-mcp crawl https://fastapi.tiangolo.com/ --depth 2 --include "/tutorial/*"

# Preview what would be crawled
gnosis-mcp crawl https://docs.python.org/ --dry-run

# Force re-crawl + embed for semantic search
gnosis-mcp crawl https://docs.sveltekit.dev/ --sitemap --force --embed
```
Respects robots.txt, caches with ETag/Last-Modified for incremental re-crawl, and rate-limits requests (5 concurrent, 0.2s delay). Crawled pages use the URL as the document path and hostname as the category — searchable like any other doc.
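The ETag/Last-Modified caching mentioned above is standard HTTP revalidation. A minimal Python sketch of the idea (not Gnosis MCP's actual implementation): resend a page's saved validators so the server can answer 304 Not Modified and the crawler can skip re-ingesting an unchanged page.

```python
from urllib.request import Request

def revalidation_request(url: str, cache: dict) -> Request:
    """Build a conditional GET from validators saved on the last crawl."""
    headers = {}
    entry = cache.get(url, {})
    if "etag" in entry:
        headers["If-None-Match"] = entry["etag"]           # 304 if the ETag still matches
    if "last_modified" in entry:
        headers["If-Modified-Since"] = entry["last_modified"]
    return Request(url, headers=headers)
```

A response with status 304 carries no body; the crawler keeps the cached copy and moves on.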
## Git History

Turn commit messages into searchable context — your agent learns why things were built, not just what exists:
```sh
gnosis-mcp ingest-git .                                     # current repo, all files
gnosis-mcp ingest-git /path/to/repo --since 6m              # last 6 months only
gnosis-mcp ingest-git . --include "src/*" --max-commits 5   # filtered + limited
gnosis-mcp ingest-git . --dry-run                           # preview without ingesting
gnosis-mcp ingest-git . --embed                             # embed for semantic search
```
Each file's commit history becomes a searchable markdown document stored as `git-history/<file-path>`. The agent finds it via `search_docs` like any other doc — no new tools needed. Incremental re-ingest skips files with unchanged history.
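For instance, a generated document for a file like `src/auth.py` might look roughly like this (the layout and commit content below are invented for illustration; the real format is defined by Gnosis MCP):

```markdown
# Commit history: src/auth.py

## 2024-03-02 — Switch token refresh to sliding expiry

Refresh tokens now roll forward on use, so active sessions
never hit a hard cutoff.
```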
Editor Integrations
Add the server config to your editor — your AI agent gets search_docs, get_doc, and get_related tools automatically:
{
"mcpServers": {
"docs": {
"command": "gnosis-mcp",
"args": ["serve"]
}
}
}
| Editor | Config file |
|--------|------------|
| Claude Code | `.claude/mcp.json` (or install as plugin) |
| Cursor | `.cursor/mcp.json` |
| Windsurf | `~/.codeium/windsurf/mcp_config.json` |
| JetBrains | Settings > Tools > AI Assistant > MCP Servers |
| Cline | Cline MCP settings panel |
Add to `.vscode/mcp.json` (note: `"servers"`, not `"mcpServers"`):

```json
{
  "servers": {
    "docs": {
      "command": "gnosis-mcp",
      "args": ["serve"]
    }
  }
}
```
Also discoverable via the VS Code MCP gallery — search `@mcp gnosis` in the Extensions view.
For remote deployment, use Streamable HTTP:

```sh
gnosis-mcp serve --transport streamable-http --host 0.0.0.0 --port 8000
```
## REST API

v0.10.0+ — enable native HTTP endpoints alongside MCP on the same port:

```sh
gnosis-mcp serve --transport streamable-http --rest
```
Web apps can now query your docs over plain HTTP — no MCP protocol required.
| Endpoint | Description |
|----------|-------------|
| `GET /health` | Server status, version, doc count |
| `GET /api/search?q=&limit=&category=` | Search docs (auto-embeds with local provider) |
| `GET /api/docs/{path}` | Get document by file path |
| `GET /api/docs/{path}/related` | Get related documents |
| `GET /api/categories` | List categories with counts |
Environment variables:
| Variable | Description |
|----------|-------------|
| `GNOSIS_MCP_REST=true` | Enable REST API (same as `--rest`) |
| `GNOSIS_MCP_CORS_ORIGINS` | CORS allowed origins: `*` or comma-separated list |
| `GNOSIS_MCP_API_KEY` | Optional Bearer token auth |
Examples:

```sh
# Health check
curl http://127.0.0.1:8000/health

# Search
curl "http://127.0.0.1:8000/api/search?q=authentication&limit=5"

# With API key
curl -H "Authorization: Bearer sk-secret" "http://127.0.0.1:8000/api/search?q=setup"
```
## Backends
| | SQLite (default) | SQLite + embeddings | PostgreSQL |
|---|---|---|---|
| Install | pip install gnosis-mcp | pip install gnosis-mcp[embeddings] | pip install gnosis-mcp[postgres] |
| Config | Nothing | Nothing | Set GNOSIS_MCP_DATABASE_URL |
| Search | FTS5 keyword (BM25) | Hybrid keyword + semantic (RRF) | tsvector + pgvector hybrid |
| Embeddings | None | Local ONNX (23MB, no API key) | Any provider + HNSW index |
| Multi-table | No | No | Yes (UNION ALL) |
| Best for | Quick start, keyword-only | Semantic search without a server | Production, large doc sets |
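RRF (Reciprocal Rank Fusion), named in the table, merges the keyword and semantic result lists by rank rather than by raw score. A generic sketch of the technique (k=60 is the conventional constant; Gnosis MCP's actual parameters aren't documented here):

```python
def rrf_merge(rankings: list[list[str]], k: int = 60) -> list[str]:
    """Fuse ranked lists: each doc scores sum(1 / (k + rank)) across lists."""
    scores: dict[str, float] = {}
    for ranking in rankings:
        for rank, doc in enumerate(ranking, start=1):
            scores[doc] = scores.get(doc, 0.0) + 1.0 / (k + rank)
    # Highest fused score first
    return sorted(scores, key=scores.get, reverse=True)
```

A doc that appears near the top of both the BM25 list and the embedding list outranks one that is first in only one of them, which is why hybrid search tolerates noise in either signal.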
Auto-detection: set `GNOSIS_MCP_DATABASE_URL` to `postgresql://...` and it uses PostgreSQL; leave it unset and it uses SQLite. Override with `GNOSIS_MCP_BACKEND=sqlite|postgres`.
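The precedence reads as: an explicit `GNOSIS_MCP_BACKEND` wins, then the database URL scheme, then the SQLite default. An illustrative sketch of that rule (not the project's actual code):

```python
import os

def pick_backend(env=os.environ) -> str:
    """Choose a backend the way the docs describe auto-detection."""
    override = env.get("GNOSIS_MCP_BACKEND")
    if override in ("sqlite", "postgres"):
        return override
    url = env.get("GNOSIS_MCP_DATABASE_URL", "")
    return "postgres" if url.startswith("postgresql://") else "sqlite"
```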
```sh
pip install gnosis-mcp[postgres]
export GNOSIS_MCP_DATABASE_URL="postgresql://user:pass@localhost:5432/mydb"
gnosis-mcp init-db          # create tables + indexes
gnosis-mcp ingest ./docs/   # load your markdown
gnosis-mcp serve
```
For hybrid semantic+keyword search, also enable pgvector:

```sql
CREATE EXTENSION IF NOT EXISTS vector;
```

Then backfill embeddings:

```sh
gnosis-mcp embed                     # via OpenAI (default)
gnosis-mcp embed --provider ollama   # or use local Ollama
```
## Claude Code Plugin

For Claude Code users, install as a plugin to get the MCP server plus slash commands:

```sh
claude plugin marketplace add nicholasglazer/gnosis-mcp
claude plugin install gnosis
```
This gives you:
| Component | What you get |
|-----------|-------------|
| MCP server | `gnosis-mcp serve` — auto-configured |
| `/gnosis:search` | Search docs with keyword or `--semantic` hybrid mode |
| `/gnosis:status` | Health check — connectivity, doc stats, troubleshooting |
| `/gnosis:manage` | CRUD — add, delete, update metadata, bulk embed |
The plugin works with both SQLite and PostgreSQL backends.
<details>
<summary>Manual setup (without plugin)</summary>

Add to `.claude/mcp.json`:

```json
{
  "mcpServers": {
    "gnosis": {
      "command": "gnosis-mcp",
      "args": ["serve"]
    }
  }
}
```

For PostgreSQL, add `"env": {"GNOSIS_MCP_DATABASE_URL": "postgresql://..."}`.

</details>
## Tools & Resources

Gnosis MCP exposes 6 tools and 3 resources over [MCP](https://modelcontextprotocol.io).