
OpenDeepWiki

OpenDeepWiki is the open-source version of the DeepWiki project, aiming to provide a powerful knowledge management and collaboration platform. It is developed mainly in C# and TypeScript, with a modular design that is easy to extend and customize.


中文 | English

<div align="center"> <img src="/img/favicon.png" alt="OpenDeepWiki Logo" width="220" /> <h3>AI-Driven Code Knowledge Base</h3> </div>

Enterprise Service

Our enterprise service offers comprehensive support and flexible pricing for businesses seeking professional AI solutions.


Features

  • Quick Conversion: Converts GitHub, GitLab, AtomGit, Gitee, Gitea, and other code repositories into knowledge bases within minutes.
  • Multi-language Support: Supports code analysis and documentation generation for all programming languages.
  • Code Structure Diagrams: Automatically generates Mermaid diagrams to help understand code structure.
  • Custom Model Support: Supports custom models and custom APIs for flexible extension.
  • AI Intelligent Analysis: AI-based code analysis and code relationship understanding.
  • SEO Friendly: Generates SEO-friendly documentation and knowledge bases based on Next.js for easy search engine crawling.
  • Conversational Interaction: Supports conversations with AI to obtain detailed code information and usage methods for deep code understanding.

Feature List

  • [x] Support multiple code repositories (GitHub, GitLab, AtomGit, Gitee, Gitea, etc.)
  • [x] Support multiple programming languages (Python, Java, C#, JavaScript, etc.)
  • [x] Support repository management (CRUD operations on repositories)
  • [x] Support multiple AI providers (OpenAI, AzureOpenAI, Anthropic, etc.)
  • [x] Support multiple databases (SQLite, PostgreSQL, SqlServer, MySQL, etc.)
  • [x] Support multiple languages (Chinese, English, French, etc.)
  • [x] Support uploading ZIP files and local files
  • [x] Provide data fine-tuning platform to generate fine-tuning datasets
  • [x] Support directory-level repository management with dynamic directory and document generation
  • [x] Support repository directory modification management
  • [x] Support user management (CRUD operations on users)
  • [x] Support user permission management
  • [x] Support repository-level generation of different fine-tuning framework datasets

Project Introduction

OpenDeepWiki is an open-source project inspired by DeepWiki, developed based on .NET 9 and Semantic Kernel. It aims to help developers better understand and utilize code repositories, providing features such as code analysis, documentation generation, and knowledge graph construction.

Main Features:

  • Analyze code structure
  • Understand repository core concepts
  • Generate code documentation
  • Automatically generate README.md for code
  • Support MCP (Model Context Protocol)

MCP Support

OpenDeepWiki supports the MCP protocol:

  • Can serve as a single repository MCPServer for repository analysis.

Example configuration:

{
  "mcpServers": {
    "OpenDeepWiki": {
      "url": "http://your-opendeepwiki-host:port/api/mcp?owner=AIDotNet&name=OpenDeepWiki"
    }
  }
}

If your MCP client does not support streamable HTTP, use the SSE endpoint instead:


{
  "mcpServers": {
    "OpenDeepWiki": {
      "url": "http://your-opendeepwiki-host:port/api/mcp/sse?owner=AIDotNet&name=OpenDeepWiki"
    }
  }
}

MCP Streamable Configuration

You can configure MCP streamable support for specific services using the MCP_STREAMABLE environment variable:

environment:
  # Format: serviceName1=streamableUrl1,serviceName2=streamableUrl2
  - MCP_STREAMABLE=claude=http://localhost:8080/api/mcp,windsurf=http://localhost:8080/api/mcp

This allows you to specify which services should use streamable HTTP endpoints and their corresponding URLs.
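The `serviceName=url` pair format above can be parsed with a small helper. The sketch below is illustrative, not OpenDeepWiki's actual implementation:

```python
def parse_mcp_streamable(value: str) -> dict[str, str]:
    """Parse 'name1=url1,name2=url2' into a {service: url} mapping.

    Splits on the first '=' of each pair so URLs containing '='
    (e.g. query strings) survive intact.
    """
    mapping = {}
    for pair in filter(None, (p.strip() for p in value.split(","))):
        name, sep, url = pair.partition("=")
        if sep:  # skip malformed pairs that have no '='
            mapping[name] = url
    return mapping

config = parse_mcp_streamable(
    "claude=http://localhost:8080/api/mcp,windsurf=http://localhost:8080/api/mcp"
)
```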

  • owner: Repository organization or owner name
  • name: Repository name
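As a sketch, the MCP endpoint URL can be assembled from these two parameters. The helper name and base-URL handling below are illustrative assumptions; only the `/api/mcp` and `/api/mcp/sse` paths and the `owner`/`name` query parameters come from the configuration examples above:

```python
from urllib.parse import urlencode

def mcp_server_url(base: str, owner: str, name: str, sse: bool = False) -> str:
    """Build the MCP endpoint URL for a repository.

    `base` is your OpenDeepWiki host, e.g. "http://localhost:8090".
    With sse=True the SSE fallback endpoint is used instead of
    streamable HTTP.
    """
    path = "/api/mcp/sse" if sse else "/api/mcp"
    return f"{base.rstrip('/')}{path}?{urlencode({'owner': owner, 'name': name})}"
```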

After adding the repository, test it by asking a question such as "What is OpenDeepWiki?".

This way, OpenDeepWiki can serve as an MCPServer for other AI models to call, facilitating analysis and understanding of open-source projects.


🚀 Quick Start

  1. Clone the repository
git clone https://github.com/AIDotNet/OpenDeepWiki.git
cd OpenDeepWiki
  2. Modify environment variable configuration in docker-compose.yml:
  • OpenAI example:
services:
  koalawiki:
    environment:
      - TASK_MAX_SIZE_PER_USER=2 # Maximum parallel document generation tasks per user for AI
      - CHAT_MODEL=DeepSeek-V3 # Model must support function calling
      - ANALYSIS_MODEL= # Analysis model for generating repository directory structure
      - CHAT_API_KEY= # Your API Key
      - LANGUAGE= # Default generation language, e.g., "Chinese"
      - ENDPOINT=https://api.token-ai.cn/v1
      - DB_TYPE=sqlite
      - MODEL_PROVIDER=OpenAI # Model provider, supports OpenAI, AzureOpenAI
      - DB_CONNECTION_STRING=Data Source=/data/KoalaWiki.db
      - EnableSmartFilter=true # Whether to enable smart filtering; affects how the AI retrieves the repository file tree
      - UPDATE_INTERVAL=5 # Repository incremental update interval in days
      - MAX_FILE_LIMIT=100 # Maximum upload file limit in MB
      - DEEP_RESEARCH_MODEL= # Deep research model, if empty uses CHAT_MODEL
      - ENABLE_INCREMENTAL_UPDATE=true # Whether to enable incremental updates
      - ENABLE_CODED_DEPENDENCY_ANALYSIS=false # Whether to enable code dependency analysis, may affect code quality
      - ENABLE_WAREHOUSE_COMMIT=true # Whether to enable warehouse commit
      - ENABLE_FILE_COMMIT=true # Whether to enable file commit
      - REFINE_AND_ENHANCE_QUALITY=false # Whether to refine and enhance quality
      - CATALOGUE_FORMAT=compact # Directory structure format (compact, json, pathlist, unix)
      - CUSTOM_BODY_PARAMS= # Custom request body parameters, format: key1=value1,key2=value2 (e.g., stop=<|im_end|>,max_tokens=4096)
      - READ_MAX_TOKENS=100000 # Maximum tokens the AI may read from files, to prevent unbounded reading; ~70% of the model's maximum context is recommended
      - MCP_STREAMABLE= # MCP service streamable configuration, format: serviceName=streamableUrl (e.g., claude=http://localhost:8080/api/mcp,windsurf=http://localhost:8080/api/mcp)
      # Auto Context Compression configuration (optional)
      - AUTO_CONTEXT_COMPRESS_ENABLED=false # Whether to enable AI-powered intelligent context compression
      - AUTO_CONTEXT_COMPRESS_TOKEN_LIMIT=100000 # Token limit to trigger compression (required when enabled)
      - AUTO_CONTEXT_COMPRESS_MAX_TOKEN_LIMIT=200000 # Maximum allowed token limit (default: 200000)
      # Feishu Bot configuration (optional)
      - FeishuAppId=
      - FeishuAppSecret=
      - FeishuBotName=KoalaWiki
  • AzureOpenAI and Anthropic configurations are similar; only ENDPOINT and MODEL_PROVIDER need to be adjusted.
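The variables above can be read and type-coerced at startup. This sketch uses the variable names and defaults from the compose file; the helper itself and its coercion rules are illustrative assumptions, not OpenDeepWiki's code:

```python
import os

def read_settings(env=os.environ) -> dict:
    """Read a few OpenDeepWiki settings with type coercion.

    Variable names and defaults come from docker-compose.yml;
    note that EnableSmartFilter is mixed-case there.
    """
    truthy = {"true", "1", "yes"}
    return {
        "task_max_size_per_user": int(env.get("TASK_MAX_SIZE_PER_USER", "2")),
        "chat_model": env.get("CHAT_MODEL", "DeepSeek-V3"),
        "enable_smart_filter": env.get("EnableSmartFilter", "true").lower() in truthy,
        "read_max_tokens": int(env.get("READ_MAX_TOKENS", "100000")),
        "update_interval_days": int(env.get("UPDATE_INTERVAL", "5")),
    }

settings = read_settings({"READ_MAX_TOKENS": "70000", "EnableSmartFilter": "false"})
```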

Database Configuration

SQLite (Default)

- DB_TYPE=sqlite
- DB_CONNECTION_STRING=Data Source=/data/KoalaWiki.db

PostgreSQL

- DB_TYPE=postgres
- DB_CONNECTION_STRING=Host=localhost;Database=KoalaWiki;Username=postgres;Password=password

SQL Server

- DB_TYPE=sqlserver
- DB_CONNECTION_STRING=Server=localhost;Database=KoalaWiki;Trusted_Connection=true;

MySQL

- DB_TYPE=mysql
- DB_CONNECTION_STRING=Server=localhost;Database=KoalaWiki;Uid=root;Pwd=password;
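Selecting a connection string from `DB_TYPE` amounts to a lookup over the values shown above. The helper below is an illustrative sketch, not OpenDeepWiki's actual code:

```python
# Example connection strings for each supported DB_TYPE,
# taken from the sections above.
CONNECTION_STRINGS = {
    "sqlite": "Data Source=/data/KoalaWiki.db",
    "postgres": "Host=localhost;Database=KoalaWiki;Username=postgres;Password=password",
    "sqlserver": "Server=localhost;Database=KoalaWiki;Trusted_Connection=true;",
    "mysql": "Server=localhost;Database=KoalaWiki;Uid=root;Pwd=password;",
}

def connection_string(db_type: str) -> str:
    """Resolve DB_TYPE (case-insensitive) to a connection string."""
    try:
        return CONNECTION_STRINGS[db_type.lower()]
    except KeyError:
        raise ValueError(f"Unsupported DB_TYPE: {db_type!r}") from None
```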
  3. Start services

Using Makefile commands:

# Build all Docker images
make build

# Start all services in background
make up

# Start in development mode (with visible logs)
make dev

Visit http://localhost:8090 to access the knowledge base page.

For Windows users without a make environment, use Docker Compose directly:

docker-compose build      # Build all images
docker-compose up -d      # Start all services in background
docker-compose up         # Start in foreground with visible logs
docker-compose down       # Stop and remove all services
docker-compose logs -f    # Follow service logs

Deployment Recommendations

  • Build for specific architecture:
docker-compose build --build-arg ARCH=arm64
docker-compose build --build-arg ARCH=amd64
  • Build only backend or frontend:
docker-compose build koalawiki
docker-compose build koalawiki-web
  • One-click deployment to Sealos (supports public network access):

Deploy on Sealos

For detailed steps, please refer to: One-click Sealos Deployment of OpenDeepWiki


🔍 How It Works

OpenDeepWiki leverages AI to achieve:

  • Clone code repository locally
  • Read .gitignore configuration to ignore irrelevant files
  • Recursively scan directories to get all files and directories
  • Determine if file count exceeds threshold; if so, call AI model for intelligent directory filtering
  • Parse AI-returned directory JSON data
  • Generate or update README.md
  • Call AI model to generate repository classification information and project overview
  • Clean project analysis tag content and save project overview to database
  • Call AI to generate thinking directory (task list)
  • Recursively process directory tasks to generate document directory structure
  • Save directory structure to database
  • Process incomplete document tasks
  • If Git repository, clean old commit records, call AI to generate update log and save
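The scan-and-threshold decision in the steps above can be sketched as follows. The function names and the default threshold are illustrative assumptions; this sketch also only skips `.git` rather than fully honoring `.gitignore` as OpenDeepWiki does:

```python
from pathlib import Path

def list_files(root: str) -> list[str]:
    """Recursively collect file paths, skipping the .git directory
    (the real pipeline also honors .gitignore)."""
    return [
        str(p.relative_to(root))
        for p in Path(root).rglob("*")
        if p.is_file() and ".git" not in p.parts
    ]

def filter_strategy(file_count: int, threshold: int = 500) -> str:
    """Decide whether the directory tree is returned directly or
    first passed to the AI model for smart filtering."""
    return "ai_filter" if file_count > threshold else "direct"
```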

Detailed flow chart of OpenDeepWiki's repository-parsing-to-documentation pipeline:

graph TD
    A[Clone code repository] --> B[Read .gitignore configuration to ignore files]
    B --> C[Recursively scan directories to get all files and directories]
    C --> D{Does file count exceed threshold?}
    D -- No --> E[Directly return directory structure]
    D -- Yes --> F[Call AI model for intelligent directory structure filtering]
    F --> G[Parse AI-returned directory JSON data]
    E --> G
    G --> H[Generate or update README.md]
    H --> I[Call AI model to generate repository classification information and project overview]
    I --> J[Clean project analysis tag content and save project overview to database]
    J --> K[Call AI to generate thinking directory task list]
    K --> L[Recursively process directory tasks to generate document directory structure]
    L --> M[Save directory structure to database]
    M --> N[Process incomplete document tasks]
    N --> O{Is it a Git repository?}
    O -- Yes --> P[Clean old commit records, call AI to generate update log and save]
    O -- No --> Q[Done]
    P --> Q

No findings