SocratiCode
Enterprise-grade (40M+ lines) codebase intelligence in a zero-setup, private, local Claude plugin or MCP server: managed indexing, hybrid semantic search, polyglot code dependency graphs, and DB/API/infra knowledge. Benchmark: 61% fewer tokens, 84% fewer calls, 37x faster than standard AI grep.
<p align="center"> <a href="https://github.com/giancarloerra/socraticode/actions/workflows/ci.yml"><img src="https://github.com/giancarloerra/socraticode/actions/workflows/ci.yml/badge.svg" alt="CI"></a> <a href="LICENSE"><img src="https://img.shields.io/badge/License-AGPL--3.0-blue.svg" alt="License: AGPL-3.0"></a> <a href="https://www.npmjs.com/package/socraticode"><img src="https://img.shields.io/npm/v/socraticode.svg" alt="npm version"></a> <a href="https://nodejs.org/"><img src="https://img.shields.io/badge/node-%3E%3D18-brightgreen.svg" alt="Node.js >= 18"></a> <a href="https://github.com/giancarloerra/socraticode"><img src="https://img.shields.io/github/stars/giancarloerra/socraticode?style=social" alt="GitHub stars"></a> </p> <p align="center"> <a href="#claude-code-plugin-recommended-for-claude-code-users"><img src="https://img.shields.io/badge/Claude_Code-Install_Plugin-CC785C?style=flat-square&logoColor=white" alt="Install Claude Code Plugin"></a> <a href="https://insiders.vscode.dev/redirect/mcp/install?name=socraticode&config=%7B%22command%22%3A%22npx%22%2C%22args%22%3A%5B%22-y%22%2C%22socraticode%22%5D%7D"><img src="https://img.shields.io/badge/VS_Code-Install_MCP_Server-0098FF?style=flat-square&logo=visualstudiocode&logoColor=white" alt="Install in VS Code"></a> <a href="https://insiders.vscode.dev/redirect/mcp/install?name=socraticode&config=%7B%22command%22%3A%22npx%22%2C%22args%22%3A%5B%22-y%22%2C%22socraticode%22%5D%7D&quality=insiders"><img src="https://img.shields.io/badge/VS_Code_Insiders-Install_MCP_Server-24bfa5?style=flat-square&logo=visualstudiocode&logoColor=white" alt="Install in VS Code Insiders"></a> <a href="cursor://anysphere.cursor-deeplink/mcp/install?name=socraticode&config=eyJjb21tYW5kIjoibnB4IiwiYXJncyI6WyIteSIsInNvY3JhdGljb2RlIl19"><img src="https://img.shields.io/badge/Cursor-Install_MCP_Server-F14C28?style=flat-square&logo=cursor&logoColor=white" alt="Install in Cursor"></a> </p>

> "There is only one good, knowledge, and one evil, ignorance." — Socrates
Give any AI instant automated knowledge of your entire codebase (and infrastructure) — at scale, zero configuration, fully private, completely free.
<p align="center"> Kindly sponsored by <a href="https://altaire.com">Altaire Limited</a> </p>

If SocratiCode has been useful to you, please ⭐ star this repo — it helps others discover it — and share it with your dev team and fellow developers!
One thing, done well: deep codebase intelligence — zero setup, no bloat, fully automatic. SocratiCode gives AI assistants deep semantic understanding of your codebase — hybrid search, polyglot code dependency graphs, and searchable context artifacts (database schemas, API specs, infra configs, architecture docs). Add it to any MCP host, or install the Claude Code plugin for built-in workflow skills; it requires no configuration and manages everything automatically.
Production-ready and battle-tested on enterprise-scale repositories (40+ million lines of code). Indexing is batched and resumable: progress is checkpointed, so pauses, crashes, restarts, and interruptions don't lose work. The file watcher keeps the index up to date on every file change and across sessions. Multi-agent ready — multiple AI agents can work on the same codebase simultaneously, sharing a single index with automatic coordination and zero configuration.
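To illustrate the checkpointing idea described above (this is a simplified sketch under assumed names, not SocratiCode's actual code): after each batch the last completed position is persisted, so a restart resumes from the checkpoint instead of re-indexing from zero.

```typescript
// Hypothetical sketch of checkpointed, resumable batch indexing.
type Checkpoint = { nextIndex: number }; // persisted between runs (assumption)

function indexWithCheckpoints(
  files: string[],
  batchSize: number,
  checkpoint: Checkpoint,
  indexBatch: (batch: string[]) => void,
): void {
  // Start wherever the last run left off.
  while (checkpoint.nextIndex < files.length) {
    const batch = files.slice(checkpoint.nextIndex, checkpoint.nextIndex + batchSize);
    indexBatch(batch);
    // Advance (and, in a real system, persist) the checkpoint only after
    // the batch has been fully indexed, so an interruption loses at most
    // one in-flight batch.
    checkpoint.nextIndex += batch.length;
  }
}
```

If the process dies mid-run, re-invoking the function with the persisted checkpoint processes only the remaining files.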
Private and local by default — Docker handles everything, no API keys required, no data leaves your machine. Cloud ready for embeddings (OpenAI, Google Gemini) and Qdrant, and a full suite of configuration options are all available when you need them.
The first Qdrant‑based MCP server/Claude plugin/skill that pairs auto‑managed, zero‑config local Docker deployment with AST‑aware code chunking, hybrid semantic + BM25 (RRF‑fused) code search, polyglot dependency graphs with circular‑dependency visualization, and searchable infra/API/database artifacts, all in a single focused, easy-to-use code intelligence engine.
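Reciprocal Rank Fusion (RRF), mentioned above, merges the ranked result lists from semantic search and BM25 by scoring each document on the reciprocal of its rank in each list. A minimal illustrative sketch (not SocratiCode's implementation; `k = 60` is the constant commonly used in the RRF literature):

```typescript
// Fuse two ranked lists of document IDs with Reciprocal Rank Fusion.
// Each document scores sum(1 / (k + rank)) over the lists it appears in,
// so documents ranked highly by BOTH retrievers rise to the top.
function rrfFuse(semantic: string[], bm25: string[], k = 60): string[] {
  const scores = new Map<string, number>();
  for (const list of [semantic, bm25]) {
    list.forEach((id, rank) => {
      scores.set(id, (scores.get(id) ?? 0) + 1 / (k + rank + 1));
    });
  }
  return Array.from(scores.entries())
    .sort((a, b) => b[1] - a[1])
    .map(([id]) => id);
}
```

A document that is mid-ranked by both retrievers can outrank one that is first in one list but last in the other — the property that makes hybrid search robust to either retriever's blind spots.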
<p align="center"> <a href="https://www.star-history.com/?repos=giancarloerra%2Fsocraticode&type=date&logscale=&legend=top-left"> <picture> <source media="(prefers-color-scheme: dark)" srcset="https://api.star-history.com/image?repos=giancarloerra/socraticode&type=date&theme=dark&legend=top-left" /> <source media="(prefers-color-scheme: light)" srcset="https://api.star-history.com/image?repos=giancarloerra/socraticode&type=date&legend=top-left" /> <img alt="Star History Chart" src="https://api.star-history.com/image?repos=giancarloerra/socraticode&type=date&legend=top-left" /> </picture> </a> </p>

Benchmarked on VS Code (2.45M lines): SocratiCode uses 61% less context, 84% fewer tool calls, and is 37x faster than grep‑based exploration — tested live with Claude Opus 4.6. See the full benchmark →
Contents
- Quick Start
- Why SocratiCode
- Features
- Prerequisites
- Example Workflow
- Agent Instructions
- Configuration
- Language Support
- Ignore Rules
- Context Artifacts
- Environment Variables
- Docker Resources
- Testing
- Why Not Just Grep?
- FAQ
- License
Quick Start
Only a running Docker installation is required.
One-click install — Claude Code, VS Code, and Cursor:
All MCP hosts — add the following to your mcpServers (Claude Desktop, Windsurf, Cline, Roo Code) or servers (VS Code project-local .vscode/mcp.json) config:
```json
"socraticode": {
  "command": "npx",
  "args": ["-y", "socraticode"]
}
```
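For example, a complete Claude Desktop config file containing only this server might look like the following (a sketch — the entry goes under the top-level `mcpServers` key as described above; the config file's location varies by OS and host):

```json
{
  "mcpServers": {
    "socraticode": {
      "command": "npx",
      "args": ["-y", "socraticode"]
    }
  }
}
```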
Claude Code — install the plugin (recommended, includes workflow skills for best results):
From your shell:
```shell
claude plugin marketplace add giancarloerra/socraticode
claude plugin install socraticode@socraticode
```
Or from within Claude Code:
```
/plugin marketplace add giancarloerra/socraticode
/plugin install socraticode@socraticode
```
Auto-updates: After installing, enable automatic updates by opening `/plugin` → Marketplaces → select `socraticode` → Enable auto-update.
Or as MCP only (without skills):
```shell
claude mcp add socraticode -- npx -y socraticode
```
Updating:
`npx` caches the package after the first run. To get the latest version, clear the cache and restart your MCP host: `rm -rf ~/.npm/_npx && claude mcp restart socraticode`. Alternatively, use `npx -y socraticode@latest` in your config to always check for updates on startup (slightly slower).
OpenAI Codex CLI — add to ~/.codex/config.toml:
```toml
[mcp_servers.socraticode]
command = "npx"
args = ["-y", "socraticode"]
```
Restart your host. On first use SocratiCode automatically pulls Docker images, starts its own Qdrant and Ollama containers, and downloads the embedding model — one-time setup, ~5 minutes depending on your connection. After that, it starts in seconds.
First time on a project — ask your AI: "Index this codebase". Indexing runs in the background; ask "What is the codebase index status?" to monitor progress. Depending on codebase size and whether you're using GPU-accelerated Ollama or cloud embeddings, first-time indexing can take anywhere from a few seconds to a few minutes (under 10 minutes to first-index 3+ million lines of code on a MacBook Pro M4). Once complete, it doesn't need to be run again; you can search, explore the dependency graph, and query context artifacts.
Every time after that — just use the tools (search, graph, etc.). On server startup SocratiCode automatically detects previously indexed projects, restarts the file watcher, and runs an incremental update to catch any changes made while the server was down. If indexing was interrupted, it resumes automatically from the last checkpoint. You can also explicitly start or restart the watcher with `codebase_watch { action: "start" }`.
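Under the hood, MCP hosts issue tool invocations like the watcher restart above as JSON-RPC `tools/call` requests, per the Model Context Protocol. A sketch of what the host sends on the wire (the `id` value is arbitrary):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "codebase_watch",
    "arguments": { "action": "start" }
  }
}
```

You normally never write this yourself — asking your AI assistant to "restart the watcher" produces the equivalent call.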
macOS / Windows on large codebases: Docker containers can't use the GPU. For medium-to-large repos, install native Ollama (auto-detected, no config change needed) for Metal/CUDA acceleration, or use OpenAI embeddings for speed without a local install. Full details.
Recommended: For best results, add the Agent Instructions to your AI assistant's system prompt or project instructions.
