
English | 日本語

<picture> <source media="(prefers-color-scheme: dark)" srcset="https://raw.githubusercontent.com/drt-hub/.github/main/profile/assets/logo-dark.svg"> <img src="https://raw.githubusercontent.com/drt-hub/.github/main/profile/assets/logo.svg" alt="drt logo" width="200"> </picture>

drt — data reverse tool

Reverse ETL for the code-first data stack.


drt syncs data from your data warehouse to external services — declaratively, via YAML and CLI. Think `dbt run` → `drt run`: same developer experience, opposite data direction.

<p align="center"> <img src="docs/assets/quickstart.gif" alt="drt quickstart demo" width="700"> </p>
pip install drt-core          # core (DuckDB included)
drt init && drt run

Why drt?

| Problem | drt's answer |
|---------|--------------|
| Census/Hightouch are expensive SaaS | Free, self-hosted OSS |
| GUI-first tools don't fit CI/CD | CLI + YAML, Git-native |
| dbt/dlt ecosystem has no reverse leg | Same philosophy, same DX |
| LLM/MCP era makes GUI SaaS overkill | LLM-native by design |


Quickstart

No cloud accounts needed — runs locally with DuckDB in about 5 minutes.

1. Install

pip install drt-core

For cloud sources: `pip install drt-core[bigquery]`, `drt-core[postgres]`, etc.

2. Set up a project

mkdir my-drt-project && cd my-drt-project
drt init   # select "duckdb" as source

3. Create sample data

python -c "
import duckdb
c = duckdb.connect('warehouse.duckdb')
c.execute('''CREATE TABLE IF NOT EXISTS users AS SELECT * FROM (VALUES
  (1, 'Alice', 'alice@example.com'),
  (2, 'Bob',   'bob@example.com'),
  (3, 'Carol', 'carol@example.com')
) t(id, name, email)''')
c.close()
"

4. Create a sync

# syncs/post_users.yml
name: post_users
description: "POST user records to an API"
model: ref('users')
destination:
  type: rest_api
  url: "https://httpbin.org/post"
  method: POST
  headers:
    Content-Type: "application/json"
  body_template: |
    { "id": {{ row.id }}, "name": "{{ row.name }}", "email": "{{ row.email }}" }
sync:
  mode: full
  batch_size: 1
  on_error: fail
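Each row is rendered through `body_template` before being POSTed. The same per-row payload can be sketched in plain Python (illustrative only — column names come from the sample `users` table above; drt's actual templating engine is not shown here):

```python
import json

def render_body(row: dict) -> str:
    """Build the JSON payload for one row, mirroring body_template above."""
    return json.dumps({"id": row["id"], "name": row["name"], "email": row["email"]})

row = {"id": 1, "name": "Alice", "email": "alice@example.com"}
print(render_body(row))  # {"id": 1, "name": "Alice", "email": "alice@example.com"}
```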

5. Run

drt run --dry-run   # preview, no data sent
drt run             # run for real
drt status          # check results

See examples/ for more: Slack, Google Sheets, HubSpot, GitHub Actions, etc.
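The `batch_size` and `on_error` settings in the sync config above control how rows are grouped and how failures propagate. A minimal sketch of that semantics as I read it (function names are hypothetical; drt's actual runner may behave differently):

```python
from typing import Callable, Iterator

def batches(rows: list, batch_size: int) -> Iterator[list]:
    """Group rows into fixed-size batches; the last batch may be smaller."""
    for i in range(0, len(rows), batch_size):
        yield rows[i:i + batch_size]

def run_sync(rows: list, send: Callable[[list], None],
             batch_size: int = 1, on_error: str = "fail") -> int:
    """Send each batch; on_error='fail' stops at the first error,
    any other policy skips the failed batch and continues."""
    sent = 0
    for batch in batches(rows, batch_size):
        try:
            send(batch)
            sent += len(batch)
        except Exception:
            if on_error == "fail":
                raise
    return sent
```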


CLI Reference

drt init                    # initialize project
drt list                    # list sync definitions
drt run                     # run all syncs
drt run --select <name>     # run a specific sync
drt run --dry-run           # dry run
drt run --verbose           # show row-level error details
drt validate                # validate sync YAML configs
drt status                  # show recent sync status
drt status --verbose        # show per-row error details
drt mcp run                 # start MCP server (requires drt-core[mcp])
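These commands compose naturally in CI. A hypothetical GitHub Actions job that validates and runs syncs on a schedule (the workflow name, Python version, and schedule are illustrative, not part of drt):

```yaml
# .github/workflows/drt.yml (illustrative sketch)
name: reverse-etl
on:
  schedule:
    - cron: "0 6 * * *"
jobs:
  sync:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.12"
      - run: pip install drt-core
      - run: drt validate
      - run: drt run
```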

MCP Server

Connect drt to Claude, Cursor, or any MCP-compatible client so you can run syncs, check status, and validate configs without leaving your AI environment.

pip install drt-core[mcp]
drt mcp run

Claude Desktop (~/Library/Application Support/Claude/claude_desktop_config.json):

{
  "mcpServers": {
    "drt": {
      "command": "drt",
      "args": ["mcp", "run"]
    }
  }
}

Available MCP tools:

| Tool | What it does |
|------|--------------|
| drt_list_syncs | List all sync definitions |
| drt_run_sync | Run a sync (supports dry_run) |
| drt_get_status | Get last run result(s) |
| drt_validate | Validate sync YAML configs |
| drt_get_schema | Return JSON Schema for config files |
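Under the hood, MCP clients invoke these tools over JSON-RPC. A sketch of the `tools/call` request a client would send to run a sync (message shape per the MCP spec; the exact argument names are an assumption, not confirmed from drt's source):

```python
import json

# JSON-RPC 2.0 request an MCP client sends to invoke the drt_run_sync tool.
# The argument names ("sync_name", "dry_run") are assumed for illustration.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "drt_run_sync",
        "arguments": {"sync_name": "post_users", "dry_run": True},
    },
}
print(json.dumps(request))
```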


AI Skills for Claude Code

Install the official Claude Code skills to generate YAML, debug failures, and migrate from other tools — all from the chat interface.

Install via Plugin Marketplace (recommended)

/plugin marketplace add drt-hub/drt
/plugin install drt@drt-hub

Tip: Enable auto-update so you always get the latest skills when drt is updated: /plugin → Marketplaces → drt-hub → Enable auto-update

Manual install (slash commands)

Copy the files from .claude/commands/ into your drt project's .claude/commands/ directory.

| Skill | Trigger | What it does |
|-------|---------|--------------|
| /drt-create-sync | "create a sync" | Generates valid sync YAML from your intent |
| /drt-debug | "sync failed" | Diagnoses errors and suggests fixes |
| /drt-init | "set up drt" | Guides through project initialization |
| /drt-migrate | "migrate from Census" | Converts existing configs to drt YAML |


Connectors

| Type | Name | Status | Install |
|------|------|--------|---------|
| Source | BigQuery | ✅ v0.1 | pip install drt-core[bigquery] |
| Source | DuckDB | ✅ v0.1 | (core) |
| Source | PostgreSQL | ✅ v0.1 | pip install drt-core[postgres] |
| Source | Snowflake | 🗓 planned | pip install drt-core[snowflake] |
| Source | SQLite | ✅ v0.4.2 | (core) |
| Source | Redshift | ✅ v0.3.4 | pip install drt-core[redshift] |
| Source | ClickHouse | ✅ v0.4.3 | pip install drt-core[clickhouse] |
| Source | MySQL | 🗓 planned | pip install drt-core[mysql] |
| Destination | REST API | ✅ v0.1 | (core) |
| Destination | Slack Incoming Webhook | ✅ v0.1 | (core) |
| Destination | Discord Webhook | ✅ v0.4.2 | (core) |
| Destination | GitHub Actions (workflow_dispatch) | ✅ v0.1 | (core) |
| Destination | HubSpot (Contacts / Deals / Companies) | ✅ v0.1 | (core) |
| Destination | Google Sheets | ✅ v0.4 | pip install drt-core[sheets] |
| Destination | PostgreSQL (upsert) | ✅ v0.4 | pip install drt-core[postgres] |
| Destination | MySQL (upsert) | ✅ v0.4 | pip install drt-core[mysql] |
| Destination | CSV / JSON file | 🗓 v0.5 | (core) |
| Destination | Salesforce | 🗓 v0.6 | pip install drt-core[salesforce] |
| Destination | Notion | 🗓 planned | (core) |
| Destination | Linear | 🗓 planned | (core) |
| Destination | SendGrid | 🗓 planned | (core) |
| Integration | Dagster | ✅ v0.4 | pip install dagster-drt |
| Integration | Airflow | 🗓 v0.6 | pip install airflow-drt |
| Integration | dbt manifest reader | ✅ v0.4 | (core) |
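The PostgreSQL and MySQL destinations are listed as upserts. On Postgres, an upsert destination would plausibly emit an `INSERT ... ON CONFLICT DO UPDATE` statement; a sketch of that SQL generation (table and key names are hypothetical — this is not drt's actual code):

```python
def upsert_sql(table: str, columns: list[str], key: str) -> str:
    """Build an INSERT ... ON CONFLICT DO UPDATE statement (PostgreSQL dialect)."""
    cols = ", ".join(columns)
    placeholders = ", ".join(f"%({c})s" for c in columns)
    updates = ", ".join(f"{c} = EXCLUDED.{c}" for c in columns if c != key)
    return (f"INSERT INTO {table} ({cols}) VALUES ({placeholders}) "
            f"ON CONFLICT ({key}) DO UPDATE SET {updates}")

print(upsert_sql("users", ["id", "name", "email"], "id"))
```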


Roadmap

Detailed plans & progress → GitHub Milestones
Looking to contribute? → Good First Issues

| Version | Focus |
|---------|-------|
| v0.1 ✅ | BigQuery / DuckDB / Postgres sources · REST API / Slack / GitHub Actions / HubSpot destinations · CLI · dry-run |
| v0.2 ✅ | Incremental sync (cursor_field watermark) · retry config per-sync |
| v0.3 ✅ | MCP Server (drt mcp run) · AI Skills for Claude Code · LLM-readable docs · row-level errors · security hardening · Redshift source |
| v0.4 ✅ | Google Sheets / PostgreSQL / MySQL destinations · dagster-drt · dbt manifest reader · type safety overhaul |
| v0.5 | Snowflake source · CSV/JSON + Parquet destinations · test coverage · Docker |
| v0.6 | Salesforce · Airflow integration · Jira / Twilio / Intercom destinations |
| v0.7 | DWH destinations (Snowflake / BigQuery / ClickHouse / Databricks) · Cloud storage (S3 / GCS / Azure Blob) |
| v0.8 | Lakehouse sources (Delta Lake / Apache Iceberg) |
| v1.x | Rust engine (PyO3) |
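The v0.2 incremental sync uses a `cursor_field` watermark: only rows whose cursor value exceeds the last stored watermark are sent, and the watermark advances afterward. The idea in plain Python (field and function names illustrative; drt's implementation is not shown here):

```python
def incremental_rows(rows: list[dict], cursor_field: str, watermark):
    """Return rows newer than the stored watermark, plus the advanced watermark."""
    fresh = [r for r in rows if watermark is None or r[cursor_field] > watermark]
    new_watermark = max((r[cursor_field] for r in fresh), default=watermark)
    return fresh, new_watermark

rows = [{"id": 1, "updated_at": 10}, {"id": 2, "updated_at": 20}]
fresh, wm = incremental_rows(rows, "updated_at", 10)
print(len(fresh), wm)  # 1 20
```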


Orchestration: dagster-drt

Community-maintained Dagster integration. Expose drt syncs as Dagster assets with full observability.

pip install dagster-drt

from dagster import Definitions
from dagster_drt import drt_assets, DagsterDrtTranslator

class MyTranslator(DagsterDrtTranslator):
    def get_group_name(self, sync_config):
        return "reverse_etl"

defs = Definitions(
    assets=drt_assets(
        project_dir="path/to/drt-project",
        dagster_drt_translator=MyTranslator(),
    )
)

See dagster-drt README for full API docs (Translator, DrtConfig dry-run, MaterializeResult).


Ecosystem

drt is designed to work alongside, not against, the modern data stack:

<p align="center"> <img src="docs/assets/ecosystem.png" alt="drt ecosystem — dlt load, dbt transform, drt activate" width="700"> </p>

Contributing

See CONTRIBUTING.md.

Disclaimer

drt is an independent open-source project and is not affiliated with, endorsed by, or sponsored by dbt Labs, dlt-hub, or any other company.

"dbt" is a registered trademark of dbt Labs, Inc. "dlt" is a project maintained by dlt-hub.

drt is designed to complement these tools as part of the modern data stack, but is a separate project with its own codebase and maintainers.

License

Apache 2.0 — see LICENSE.
