
FileScopeMCP

Analyzes your codebase, identifying important files based on dependency relationships. Generates diagrams and a per-file importance score, helping AI assistants understand the codebase. Automatically parses popular programming languages such as Python, C, C++, Rust, Zig, and Lua.

Install / Use

/learn @admica/FileScopeMCP
About this skill

Quality Score

0/100

Supported Platforms

Claude Code
Cursor

README

FileScopeMCP (Model Context Protocol) Server

Understand your codebase — ranked, related, summarized, and kept up to date automatically.


A TypeScript-based MCP server and standalone daemon that ranks files by importance, tracks bidirectional dependencies, detects circular dependency chains, and autonomously maintains AI-generated summaries, concepts, and change impact assessments, keeping all of that metadata fresh in the background as your codebase changes.

Overview

FileScopeMCP is a fully autonomous file intelligence platform. Once pointed at a project it:

  1. Scans the codebase via a streaming async directory walker and builds a dependency graph with 0–10 importance scores for every file.
  2. Watches the filesystem. When files change, it incrementally updates dependency lists and importance scores, then detects semantic changes via tree-sitter AST diffing (TS/JS) or LLM-powered diff analysis (all other languages), and propagates staleness through the dependency graph via the cascade engine.
  3. Spawns a standalone LLM broker process that communicates with Ollama (or any OpenAI-compatible endpoint) over a Unix domain socket, auto-generating summaries, key concepts, and change impact assessments for stale files, keeping structured metadata current without any manual work.
  4. Runs on-demand mtime-based freshness checks that detect files changed while the server was down, so metadata is never silently stale.

All of this information is exposed to your AI assistant through the Model Context Protocol so it always has accurate, up-to-date context about your codebase structure.
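The on-demand freshness check in step 4 reduces to an mtime comparison. A minimal sketch, assuming a hypothetical record shape (the real implementation stores this metadata in SQLite):

```typescript
import { statSync } from 'node:fs';

// Illustrative record shape: the mtime captured when metadata was last generated.
interface FileRecord {
  path: string;
  analyzedMtimeMs: number;
}

// Pure comparison: metadata is stale if the file changed after analysis,
// or if the file can no longer be statted at all.
function isStale(currentMtimeMs: number | null, analyzedMtimeMs: number): boolean {
  return currentMtimeMs === null || currentMtimeMs > analyzedMtimeMs;
}

// On-read check: look up the real mtime and compare.
function checkFreshness(record: FileRecord): boolean {
  let mtime: number | null;
  try {
    mtime = statSync(record.path).mtimeMs;
  } catch {
    mtime = null; // deleted while the server was down
  }
  return isStale(mtime, record.analyzedMtimeMs);
}
```

Keeping the comparison pure and doing the `stat` at the call site makes the staleness rule trivial to test without touching the filesystem.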

Features

  • File Importance Ranking

    • Rank every file on a 0–10 scale based on its role in the dependency graph.
    • Weighted formula considers incoming dependents, outgoing dependencies, file type, location, and name significance.
    • Instantly surface the most critical files in any project.
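The exact weights are not documented here; a minimal sketch of how a weighted 0–10 score could combine those signals (all weights, caps, and regexes below are assumptions, not the actual formula):

```typescript
// Illustrative node shape from the dependency graph.
interface FileNode {
  path: string;
  dependents: string[];    // files that import this file
  dependencies: string[];  // files this file imports
}

function importanceScore(node: FileNode): number {
  let score = 0;
  score += Math.min(node.dependents.length * 1.0, 5);   // incoming edges weigh most
  score += Math.min(node.dependencies.length * 0.5, 2); // outgoing edges
  if (/\.(ts|py|rs|go)$/.test(node.path)) score += 1;   // file-type bonus
  if (/(^|\/)src\//.test(node.path)) score += 1;        // location bonus
  if (/(index|main|app)\./.test(node.path)) score += 1; // name significance
  return Math.min(Math.round(score), 10);               // clamp to the 0-10 scale
}
```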
  • Dependency Tracking

    • Bidirectional dependency relationships: which files import a given file (dependents) and which files it imports (dependencies).
    • Distinguishes local file dependencies from package dependencies.
    • Multi-language support: Python, JavaScript, TypeScript, C/C++, Rust, Lua, Zig, PHP, C#, Java, Go, Ruby.
  • Circular Dependency Detection

    • Detects all strongly connected components (circular dependency groups) using an iterative implementation of Tarjan's SCC algorithm.
    • Project-wide scan via detect_cycles or per-file query via get_cycles_for_file.
    • Identifies exactly which files participate in each cycle, helping untangle tight coupling.
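An iterative Tarjan pass over the dependency graph might look like this sketch (graph representation and names are illustrative; the explicit work stack avoids recursion so deep graphs cannot overflow the call stack):

```typescript
type Graph = Map<string, string[]>; // file -> files it imports

function detectCycles(graph: Graph): string[][] {
  const index = new Map<string, number>();
  const low = new Map<string, number>();
  const onStack = new Set<string>();
  const stack: string[] = [];
  const sccs: string[][] = [];
  let counter = 0;

  for (const start of graph.keys()) {
    if (index.has(start)) continue;
    const work: [string, number][] = [[start, 0]]; // [node, next edge index]
    while (work.length > 0) {
      const frame = work[work.length - 1];
      const [node, edgeIdx] = frame;
      if (edgeIdx === 0) { // first visit
        index.set(node, counter);
        low.set(node, counter);
        counter++;
        stack.push(node);
        onStack.add(node);
      }
      const edges = graph.get(node) ?? [];
      if (edgeIdx < edges.length) {
        frame[1]++; // advance to the next edge for the return visit
        const next = edges[edgeIdx];
        if (!index.has(next)) {
          work.push([next, 0]);
        } else if (onStack.has(next)) {
          low.set(node, Math.min(low.get(node)!, index.get(next)!));
        }
      } else { // all edges done: pop and fold low-link into the parent
        work.pop();
        if (work.length > 0) {
          const parent = work[work.length - 1][0];
          low.set(parent, Math.min(low.get(parent)!, low.get(node)!));
        }
        if (low.get(node) === index.get(node)) { // SCC root: unwind the stack
          const scc: string[] = [];
          let member: string;
          do {
            member = stack.pop()!;
            onStack.delete(member);
            scc.push(member);
          } while (member !== node);
          // Only multi-file groups are cycles; a self-import (a -> a)
          // would need an extra self-edge check.
          if (scc.length > 1) sccs.push(scc);
        }
      }
    }
  }
  return sccs;
}
```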
  • Autonomous Background Updates

    • Filesystem watcher detects add, change, and unlink events in real time.
    • Incremental updates: re-parses only the affected file, diffs old vs. new dependency lists, patches the reverse-dependency map, and recalculates importance — no full rescan.
    • Startup integrity sweep detects files added, deleted, or modified while the server was offline and heals the database before accepting requests.
    • Per-file mtime-based lazy validation on read — see Freshness Validation.
    • All mutations are serialized through an async mutex to prevent concurrent corruption.
    • Semantic change detection classifies what changed before triggering cascade — avoids unnecessary LLM calls.
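The incremental diff-and-patch step above can be sketched like so (map shape and names are assumptions): diff the re-parsed import list against the stored one and touch only the affected entries in the reverse-dependency map.

```typescript
type DependentsMap = Map<string, Set<string>>; // target -> files importing it

function patchDependencies(
  file: string,
  oldDeps: string[],
  newDeps: string[],
  dependents: DependentsMap,
): { added: string[]; removed: string[] } {
  const oldSet = new Set(oldDeps);
  const newSet = new Set(newDeps);
  const added = newDeps.filter((d) => !oldSet.has(d));
  const removed = oldDeps.filter((d) => !newSet.has(d));
  // Patch only the touched reverse edges -- no full rescan.
  for (const target of added) {
    if (!dependents.has(target)) dependents.set(target, new Set());
    dependents.get(target)!.add(file);
  }
  for (const target of removed) {
    dependents.get(target)?.delete(file);
  }
  return { added, removed };
}
```

The returned diff is also what would feed importance recalculation, since only the scores of `added` and `removed` targets (and the file itself) can have changed.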
  • File Summaries

    • Background LLM broker auto-generates summaries for files after they change.
    • Manual override via set_file_summary — your summary is preserved until the file changes again.
    • Summaries persist across server restarts in SQLite.
  • SQLite Storage

    • All data stored in .filescope/data.db (SQLite, WAL mode) inside the per-repo directory.
    • Type-safe schema via drizzle-orm: files, file_dependencies, schema_version tables.
    • Transparent auto-migration: existing JSON tree files are automatically imported on first run — no manual migration step.
  • Semantic Change Detection

    • tree-sitter AST diffing for TypeScript and JavaScript files — fast, accurate, and token-free.
    • Classifies changes as: body-only (function internals only), exports-changed (public API changed), types-changed (type signatures changed), or unknown.
    • LLM-powered diff fallback for all other languages (Python, Rust, C/C++, Go, Ruby, etc.).
    • Change classification drives the cascade engine — body-only changes skip dependent propagation entirely.
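Assuming export names and type signatures have already been extracted from the old and new parse trees (the real implementation diffs tree-sitter ASTs), the classification itself reduces to a comparison like this sketch:

```typescript
type ChangeKind = 'body-only' | 'exports-changed' | 'types-changed' | 'unknown';

// Illustrative shape: the public surface extracted from one parse of a file.
interface ApiSurface {
  exports: string[];        // exported symbol names
  typeSignatures: string[]; // rendered signatures of the public API
}

function classifyChange(before: ApiSurface | null, after: ApiSurface | null): ChangeKind {
  if (!before || !after) return 'unknown'; // parse failed on either side
  const sameList = (a: string[], b: string[]) =>
    a.length === b.length && a.every((v, i) => v === b[i]);
  if (!sameList(before.exports, after.exports)) return 'exports-changed';
  if (!sameList(before.typeSignatures, after.typeSignatures)) return 'types-changed';
  return 'body-only'; // same public surface: only function internals moved
}
```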
  • Cascade Engine

    • BFS staleness propagation through the dependency graph when exports or types change.
    • Per-field granularity: marks summary, concepts, and change_impact fields stale independently.
    • Circular dependency protection via visited set — no infinite loops.
    • Depth cap of 10 levels prevents runaway propagation on deeply nested graphs.
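The cascade described above can be sketched as a depth-capped BFS; the stale field names match the list, everything else is illustrative:

```typescript
type StaleFields = Set<'summary' | 'concepts' | 'change_impact'>;

function cascadeStaleness(
  changed: string,
  dependents: Map<string, string[]>, // file -> files that import it
  maxDepth = 10,                     // depth cap against runaway propagation
): Map<string, StaleFields> {
  const allFields: StaleFields = new Set(['summary', 'concepts', 'change_impact']);
  const stale = new Map<string, StaleFields>();
  const visited = new Set<string>([changed]); // circular dependency protection
  let frontier = [changed];
  for (let depth = 0; depth < maxDepth && frontier.length > 0; depth++) {
    const next: string[] = [];
    for (const file of frontier) {
      for (const dep of dependents.get(file) ?? []) {
        if (visited.has(dep)) continue;
        visited.add(dep);
        stale.set(dep, new Set(allFields)); // per-file copy of stale fields
        next.push(dep);
      }
    }
    frontier = next;
  }
  return stale;
}
```

A body-only change would simply skip calling this at all, which is how classification avoids unnecessary LLM work downstream.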
  • LLM Broker

    • Standalone broker process owns all Ollama (or OpenAI-compatible) communication — auto-spawned when the first MCP instance connects.
    • Communicates over a Unix domain socket at ~/.filescope/broker.sock using NDJSON protocol.
    • In-memory priority queue: interactive (tier 1) > cascade (tier 2) > background (tier 3).
    • Per-repo token usage stats persisted to ~/.filescope/stats.json.
    • LLM enabled by default (llm.enabled: true) — broker is auto-spawned on first connect.
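NDJSON framing means each message is one JSON object terminated by a newline, with partial reads buffered until the next newline arrives. A minimal decoder sketch (message fields are illustrative); a client would connect with `node:net`'s `createConnection({ path: ... })` on the socket path and feed each `data` chunk through it:

```typescript
// Illustrative broker message shape.
interface BrokerMessage {
  kind: string;
  repo?: string;
  file?: string;
}

// Returns a function that accepts raw socket chunks and emits one
// callback per complete newline-terminated JSON object.
function makeNdjsonDecoder(onMessage: (msg: BrokerMessage) => void) {
  let buffer = '';
  return (chunk: string) => {
    buffer += chunk; // hold partial lines until the terminator arrives
    let newline: number;
    while ((newline = buffer.indexOf('\n')) !== -1) {
      const line = buffer.slice(0, newline);
      buffer = buffer.slice(newline + 1);
      if (line.trim().length > 0) onMessage(JSON.parse(line));
    }
  };
}
```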
  • Custom Exclusion Patterns

    • .filescopeignore file in the project root — uses gitignore syntax (via the ignore package) to exclude files from scanning and watching.
    • exclude_and_remove MCP tool — adds glob patterns at runtime; patterns are persisted to .filescope/config.json so they survive restarts.
    • ~90 built-in default exclusion patterns covering all major languages and toolchains (version control, Node/JS/TS, Python, Rust, Go, C/C++, Java/Kotlin/Gradle, C#/.NET, Zig, build outputs, logs/temp, OS files, IDE/editor, environment/secrets, caches). You only need to add project-specific patterns.
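Since the built-in defaults already cover common toolchain directories, a project-specific `.filescopeignore` tends to be short. An illustrative example (patterns invented for this sketch, in standard gitignore syntax):

```gitignore
# Project-specific exclusions; built-in defaults already cover
# node_modules, build outputs, caches, etc.
fixtures/
*.generated.ts
docs/**/*.html
!docs/api/index.html
```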
  • Daemon Mode

    • Runs as a standalone daemon (--daemon --base-dir=<path>) for 24/7 operation without an MCP client connected.
    • PID file guard (.filescope/instance.pid) prevents concurrent daemons on the same project.
    • Graceful shutdown on SIGTERM/SIGINT — flushes pending jobs before exit.
    • File-only logging to .filescope-daemon.log in the project root — no stdout pollution.

Prerequisites

  • Node.js 22+ — required. Earlier versions may work but are untested. Download from nodejs.org.
  • npm — comes with Node.js.
  • Native build tools (usually optional) — better-sqlite3 and tree-sitter ship prebuilt binaries for most platforms. If prebuilds aren't available for your OS/arch, npm install will fall back to compiling from source, which requires:
    • Linux: python3, make, gcc (e.g., sudo apt install build-essential python3)
    • macOS: Xcode Command Line Tools (xcode-select --install)
    • Windows: Visual Studio Build Tools with C++ workload

Installation

  1. Clone this repository

  2. Build and register:

    Linux / macOS / WSL:

    ./build.sh
    

    Windows:

    build.bat
    

    Both scripts will:

    • Install npm dependencies
    • Compile TypeScript to dist/
    • Generate mcp.json for Cursor AI
    • Register the server with Claude Code (~/.claude.json)

Setting Up Local LLM (Optional)

FileScopeMCP includes an automated setup script for Ollama:

./setup-llm.sh

This script will:

  • Install Ollama if not present (supports Linux, macOS, WSL)
  • Detect GPU hardware (NVIDIA, AMD, Metal) and configure acceleration
  • Pull the default model (qwen2.5-coder:14b)
  • Verify the installation

To check status or use a different model:

./setup-llm.sh --status           # Check Ollama and model status
./setup-llm.sh --model codellama  # Pull a different model

Claude Code

The build script registers FileScopeMCP automatically. To register (or re-register) without rebuilding:

./install-mcp-claude.sh

The server is registered globally and auto-initializes to the current working directory on startup. No configuration file or manual initialization is needed. When you start a Claude Code session in your project directory, FileScopeMCP automatically scans the codebase, starts the file watcher, and runs the startup integrity sweep.

Use set_base_directory if you want to analyze a different directory or subdirectory:

set_base_directory(path: "/path/to/your/project")

Cursor AI (Linux/WSL — Cursor running on Windows)

Build inside WSL, then copy mcp.json to your project's .cursor/ directory:

{
  "mcpServers": {
    "FileScopeMCP": {
      "command": "wsl",
      "args": ["-d", "Ubuntu-24.04", "/home/yourname/FileScopeMCP/run.sh", "--base-dir=${projectRoot}"],
      "transport": "stdio",
      "disabled": false,
      "alwaysAllow": []
    }
  }
}

Cursor AI (Windows native)

{
  "mcpServers": {
    "FileScopeMCP": {
      "command": "node",
      "args": ["C:\\FileScopeMCP\\dist\\mcp-server.js", "--base-dir=${projectRoot}"],
      "transport": "stdio",
      "disabled": false,
      "alwaysAllow": []
    }
  }
}

Cursor AI (macOS / Linux native)

{
  "mcpServers": {
    "FileScopeMCP": {
      "command": "node",
      "args": ["/path/to/FileScopeMCP/dist/mcp-server.js", "--base-dir=${projectRoot}"],
      "transport": "stdio"
    }
  }
}

Daemon Mode

To run FileScopeMCP as a standalone background daemon with no MCP client attached, start the server with the flags described in the feature list:

node /path/to/FileScopeMCP/dist/mcp-server.js --daemon --base-dir=/path/to/your/project

The daemon writes its PID to .filescope/instance.pid to guard against concurrent instances, logs only to .filescope-daemon.log in the project root, and shuts down gracefully on SIGTERM/SIGINT.

View on GitHub

GitHub Stars: 286 · Forks: 33
Category: Development · Updated: 3d ago
Languages: HTML

Security Score: 80/100 (audited Mar 27, 2026; no findings)