drep
<img src="docs/images/drep.png" alt="drep logo" width="200" />
Documentation & Review Enhancement Platform
Automated code review and documentation improvement tool for Gitea, GitHub, and GitLab. Powered by your choice of LLM backend: local models (LM Studio, Ollama, llama.cpp), AWS Bedrock (Claude 4.5), or Anthropic's Claude API.
v1.1.0: Interactive configuration wizard with guided setup! Full support for Python repositories on all three major git platforms: Gitea, GitHub, and GitLab. Support for additional languages and a direct Anthropic API provider is coming soon.
Features
Proactive Code Analysis
Unlike reactive tools, drep continuously monitors repositories and automatically:
- Detects bugs, security vulnerabilities, and best practice violations
- Opens issues with detailed findings and suggested fixes
- No manual intervention required
Docstring Intelligence
LLM-powered docstring analysis purpose-built for Python:
- Generates Google-style docstrings for public APIs
- Flags TODOs, placeholders, and low-signal docstrings
- Respects decorators (e.g., @property, @classmethod) and skips simple helpers
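As an illustration of the style drep targets, here is a small hypothetical function (invented for this example) carrying the kind of Google-style docstring the analyzer generates:

```python
def rate_limit_delay(requests_made: int, limit_per_minute: int) -> float:
    """Compute how long to sleep before the next API request.

    Args:
        requests_made: Number of requests already issued this minute.
        limit_per_minute: Maximum requests allowed per minute.

    Returns:
        Seconds to wait; 0.0 if the limit has not been reached.
    """
    if requests_made < limit_per_minute:
        return 0.0
    return 60.0 / limit_per_minute
```

Google style uses indented Args: and Returns: sections rather than reStructuredText field lists, which is what drep flags when a docstring is missing or low-signal.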
Automated PR/MR Reviews
Intelligent review workflow for Gitea pull requests:
- Parses diffs into structured hunks
- Generates inline comments tied to added lines
- Produces a high-level summary with approval signal
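The first step of that workflow can be sketched in miniature. This is not drep's actual implementation, just an illustration of splitting a unified diff into hunks and collecting the added lines that inline comments would attach to:

```python
import re

HUNK_HEADER = re.compile(r"^@@ -\d+(?:,\d+)? \+(\d+)(?:,\d+)? @@")

def parse_hunks(diff_text: str):
    """Split a unified diff into hunks of {start, added} entries."""
    hunks = []
    new_line = None
    for line in diff_text.splitlines():
        m = HUNK_HEADER.match(line)
        if m:
            new_line = int(m.group(1))
            hunks.append({"start": new_line, "added": []})
        elif new_line is not None:
            if line.startswith("+") and not line.startswith("+++"):
                hunks[-1]["added"].append((new_line, line[1:]))
                new_line += 1
            elif not line.startswith("-"):
                new_line += 1  # context lines advance the new-file counter

    return hunks

diff = """\
--- a/app.py
+++ b/app.py
@@ -1,3 +1,4 @@
 import os
+import sys
 print(os.name)
"""
hunks = parse_hunks(diff)
```

Tracking the new-file line counter per hunk is what lets a review comment be pinned to an added line rather than to a diff offset.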
Flexible LLM Backends
Choose the right LLM backend for your needs:
- Local models: Complete privacy with Ollama, llama.cpp, LM Studio
- AWS Bedrock: Enterprise compliance with Claude 4.5 on AWS ✅ NEW
- Anthropic Direct: Latest Claude models with direct API access (planned)
- OpenAI-compatible: Works with any compatible endpoint
Platform Support & Roadmap
- Available now: Gitea, GitHub, GitLab + Python repositories
- Planned: Additional languages, advanced draft PR workflows
LLM-Powered Analysis
drep includes intelligent code analysis powered by LLMs via OpenAI-compatible backends (LM Studio, Ollama, open-agent-sdk) or AWS Bedrock.
Features
- Code Quality Analysis: Detects bugs, security issues, and best practice violations
- Docstring Generation: Automatically generates Google-style docstrings
- PR Reviews: Context-aware code review comments
- Smart Caching: 80%+ cache hit rate on repeated scans
- Cost Tracking: Monitor token usage and estimated costs
- Circuit Breaker: Graceful degradation when LLM unavailable
- Progress Reporting: Real-time feedback during analysis
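The circuit-breaker behavior listed above can be sketched as follows; the class, thresholds, and method names are illustrative, not drep's actual internals:

```python
import time

class CircuitBreaker:
    """Open after repeated failures so callers can degrade gracefully."""

    def __init__(self, failure_threshold: int = 3, reset_after: float = 30.0):
        self.failure_threshold = failure_threshold
        self.reset_after = reset_after
        self.failures = 0
        self.opened_at = None

    def allow(self) -> bool:
        """Return True if an LLM call should be attempted."""
        if self.opened_at is None:
            return True
        if time.monotonic() - self.opened_at >= self.reset_after:
            # Half-open: let one attempt through after the cooldown.
            self.opened_at = None
            self.failures = 0
            return True
        return False

    def record_success(self):
        self.failures = 0

    def record_failure(self):
        self.failures += 1
        if self.failures >= self.failure_threshold:
            self.opened_at = time.monotonic()
```

When the breaker is open, a scanner can skip LLM-backed checks and still complete its non-LLM analysis instead of failing the whole run.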
Quick Start
Step 1: Install drep
pip install drep-ai
Step 2: Initialize Configuration (Interactive Wizard) 🧙‍♂️
drep init
The interactive wizard guides you through:
- Config Location: Choose between current directory or user config directory
- Platform Selection: Gitea, GitHub, or GitLab
- Enterprise Servers: Detect and configure GitHub Enterprise, self-hosted GitLab/Gitea
- Repository Patterns: Use wildcards (owner/*) or specific repos (owner/repo)
- LLM Backend: OpenAI-compatible (local), AWS Bedrock, or Anthropic
- Documentation Settings: Enable markdown linting, custom dictionaries
- Advanced Options: Database URL, LLM temperature, rate limits
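Wildcard patterns like owner/* can be matched with standard glob semantics. A minimal sketch, assuming fnmatch-style matching (drep's exact matching rules may differ):

```python
from fnmatch import fnmatch

def repo_selected(full_name: str, patterns: list[str]) -> bool:
    """Return True if an owner/repo name matches any configured pattern."""
    return any(fnmatch(full_name, pattern) for pattern in patterns)

patterns = ["slb350/*", "acme/api"]
```

With these patterns, every repository under slb350 is selected, plus the single acme/api repository.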
Example Wizard Flow:
$ drep init
============================================================
Welcome to drep configuration setup!
============================================================
Where should the configuration be created?
1. Current directory (./config.yaml)
Use for project-specific configuration
2. User config directory (/Users/you/Library/Application Support/drep/config.yaml)
Use for system-wide configuration (recommended for pip/brew install)
Choose location (1, 2) [2]: 1
Step 1: Git Platform Configuration
------------------------------------------------------------
Which git platform are you using? (github, gitea, gitlab) [github]: github
GitHub Configuration:
Are you using GitHub Enterprise? [y/N]: n
Repository Configuration:
Examples: 'your-org/*' (all repos), 'owner/repo' (single repo)
Enter repositories (comma-separated) [your-org/*]: slb350/*
Step 2: LLM Configuration
------------------------------------------------------------
Enable LLM-powered code analysis? [Y/n]: y
Choose LLM provider (openai-compatible, bedrock, anthropic) [openai-compatible]: openai-compatible
OpenAI-Compatible Configuration:
API Endpoint [http://localhost:1234/v1]:
Model name [qwen3-30b-a3b]:
Require API key? [y/N]: n
... (more wizard steps) ...
============================================================
✓ Configuration created successfully!
============================================================
Config location: config.yaml
Next steps:
1. Set the GITHUB_TOKEN environment variable:
export GITHUB_TOKEN='your-api-token-here'
2. Validate your configuration:
drep validate
3. Start scanning repositories:
drep scan owner/repo
Manual Configuration (Alternative)
For advanced users who prefer YAML editing:
Option 1: Local Models (LM Studio)
- Install LM Studio: https://lmstudio.ai/
- Download a model (Qwen3-30B-A3B recommended)
- Create config.yaml:
llm:
enabled: true
endpoint: http://localhost:1234/v1 # LM Studio / OpenAI-compatible API (also works with open-agent-sdk)
model: qwen3-30b-a3b
temperature: 0.2
max_tokens: 8000
# Rate limiting
max_concurrent_global: 5
requests_per_minute: 60
# Caching
cache:
enabled: true
ttl_days: 30
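The cache settings above pair a content-addressed key with a TTL. A minimal sketch of that idea (illustrative only, not drep's cache implementation):

```python
import hashlib
import time

class AnalysisCache:
    """Cache LLM results keyed by a hash of the analyzed source."""

    def __init__(self, ttl_days: int = 30):
        self.ttl = ttl_days * 86400  # days -> seconds
        self.store = {}

    @staticmethod
    def key(source: str, model: str) -> str:
        # Same source + same model => same key, so unchanged files hit cache.
        return hashlib.sha256(f"{model}:{source}".encode()).hexdigest()

    def get(self, source: str, model: str):
        entry = self.store.get(self.key(source, model))
        if entry and time.time() - entry["at"] < self.ttl:
            return entry["result"]
        return None

    def put(self, source: str, model: str, result: str):
        self.store[self.key(source, model)] = {"result": result, "at": time.time()}
```

Keying on file content rather than file path is what makes repeated scans cheap: only files that actually changed trigger new LLM calls.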
Option 2: AWS Bedrock (Claude 4.5)
- Enable Bedrock model access in AWS Console
- Configure AWS credentials (aws configure or ~/.aws/credentials)
- Configure drep:
llm:
enabled: true
provider: bedrock # Required for AWS Bedrock
bedrock:
region: us-east-1
model: anthropic.claude-sonnet-4-5-20250929-v1:0 # Or Haiku 4.5
temperature: 0.2
max_tokens: 4000
# Caching
cache:
enabled: true
ttl_days: 30
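Under the hood, requests to Bedrock's Converse API carry the model ID and inference settings from the config above. A sketch of building such a request with boto3's bedrock-runtime client (the helper function is illustrative, not part of drep; actually invoking it requires AWS credentials and model access):

```python
def build_converse_request(model_id: str, prompt: str, max_tokens: int = 4000,
                           temperature: float = 0.2) -> dict:
    """Build kwargs for boto3 client("bedrock-runtime").converse(**kwargs)."""
    return {
        "modelId": model_id,
        "messages": [{"role": "user", "content": [{"text": prompt}]}],
        "inferenceConfig": {"maxTokens": max_tokens, "temperature": temperature},
    }

req = build_converse_request("anthropic.claude-sonnet-4-5-20250929-v1:0",
                             "Review this function for bugs.")
```

To send it: boto3.client("bedrock-runtime", region_name="us-east-1").converse(**req).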
See docs/llm-setup.md for detailed setup instructions and troubleshooting.
Run Analysis
drep scan owner/repo --show-progress --show-metrics
View Metrics
# Show detailed usage statistics
drep metrics --detailed
# Export to JSON
drep metrics --export metrics.json
# Last 7 days only
drep metrics --days 7
Example output:
===== LLM Usage Report =====
Session duration: 0h 5m 32s
Total requests: 127 (115 successful, 12 failed, 95 cached)
Success rate: 90.6%
Cache hit rate: 74.8%
Tokens used: 45,230 prompt + 12,560 completion = 57,790 total
Estimated cost: $0.29 USD (or $0 with LM Studio)
Performance:
Average latency: 1250ms
Min/Max: 450ms / 3200ms
By Analyzer:
code_quality: 45 requests (12,345 tokens)
docstring: 38 requests (8,901 tokens)
pr_review: 44 requests (36,544 tokens)
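The estimated-cost line in the report is simple arithmetic over token counts. A sketch with placeholder per-million-token prices (check your provider's current pricing; local models cost $0):

```python
def estimate_cost(prompt_tokens: int, completion_tokens: int,
                  price_in_per_m: float, price_out_per_m: float) -> float:
    """Estimate USD cost from token counts and per-million-token prices."""
    return (prompt_tokens * price_in_per_m
            + completion_tokens * price_out_per_m) / 1_000_000

# Token counts taken from the example report above; prices are placeholders.
cost = estimate_cost(45_230, 12_560, price_in_per_m=3.0, price_out_per_m=15.0)
```

Completion tokens typically cost several times more than prompt tokens, which is why the report tracks the two counts separately.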
Installation
Via Homebrew (macOS/Linux)
brew tap slb350/drep
brew install drep-ai
Via pip
pip install drep-ai
Note: The PyPI package is named drep-ai (the name drep was already taken). After installation, the command-line tool is still drep.
From source
git clone https://github.com/slb350/drep.git
cd drep
pip install -e ".[dev]"
Via Docker
docker pull ghcr.io/slb350/drep:latest
Configuration
drep supports GitHub, Gitea, and GitLab. The init command will ask which platform you're using and generate the correct configuration.
Step 1: Initialize Configuration
drep init
You'll be prompted to choose where to store the configuration and which platform to use:
Where should the configuration be created?
1. Current directory (./config.yaml)
Use for project-specific configuration
2. User config directory (~/.config/drep/config.yaml)
Use for system-wide configuration (recommended for pip/brew install)
Choose location (1, 2) [2]: 2
Step 1: Platform Configuration
------------------------------------------------------------
Which git platform are you using?
Choose platform (github, gitea, gitlab) [github]: github
✓ Configuration created successfully!
------------------------------------------------------------
Config location: /Users/yourname/.config/drep/config.yaml
Next steps:
1. Set the GITHUB_TOKEN environment variable:
export GITHUB_TOKEN='your-api-token-here'
Config File Discovery: drep automatically finds your config file in this order:
- Explicit --config path (if provided)
- DREP_CONFIG environment variable
- ./config.yaml (project-specific)
- ~/.config/drep/config.yaml (user config)
This means you can run drep scan owner/repo without specifying --config - it will automatically find your configuration!
Step 2: Set Your API Token
Create an API token from your platform:
For GitHub:
- Go to Settings → Developer settings → Personal access tokens → Tokens (classic)
- Generate new token with repo scope
- Set the environment variable:
export GITHUB_TOKEN="ghp_your_token_here"
For Gitea:
- Go to Settings → Applications → Generate New Token
- Set the environment variable:
export GITEA_TOKEN="your_token_here"
For GitLab:
- Go to User Settings → Access Tokens
- Create token with api scope
- Set the environment variable:
export GITLAB_TOKEN="your_token_here"
Step 3: Configure Repositories (Optional)
Edit your config file (location shown in drep init output) to specify which repositories to monitor:
