# Cheatcode

**Open-source AI coding agent that builds, runs, and ships full-stack applications.**

Describe what you want. Cheatcode writes the code, executes it in a sandbox, shows you a live preview, and deploys it -- all from a single chat interface.

Website | Report Bug | Request Feature
## What is Cheatcode?
Cheatcode is a production-ready AI coding agent with a chat-based interface. You describe what you want to build, and the agent writes code, runs commands, takes screenshots, searches the web, and iterates -- all inside an isolated sandbox with a live preview of your app.
Key capabilities:
- Build apps through conversation -- Chat with an AI agent that writes, edits, and runs code in real-time
- Live preview -- See your web or mobile (Expo) app update live as the agent works
- Sandboxed execution -- All code runs in isolated Daytona sandboxes, not on your machine
- 100+ LLM models -- Use Gemini, Claude, GPT, Grok, Llama, and more via OpenRouter
- 13 built-in tools -- File editing, shell commands, grep, screenshots, vision analysis, LSP, web search, and more
- One-click deploy -- Ship to Vercel directly from the interface
- Third-party integrations -- Connect GitHub, Slack, Gmail, Notion via Composio MCP
- Bring Your Own Key -- Use your own OpenRouter API key for unlimited usage
- Self-hostable -- Run the entire stack on your own infrastructure with Docker Compose
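The conversational build loop above can be sketched in a few lines. This is an illustrative Python sketch, not Cheatcode's actual API: the function name, message shapes, and tool-call format are hypothetical stand-ins for the real loop in `backend/agent/run.py`.

```python
# Illustrative agent loop: call the LLM, execute any tool it requests,
# feed the result back, stop when the model signals completion.
# All names here are hypothetical -- see backend/agent/run.py for the real thing.

def run_agent(llm, tools, user_message, max_turns=10):
    """`llm` maps a message list to a reply dict; `tools` maps names to callables."""
    messages = [{"role": "user", "content": user_message}]
    for _ in range(max_turns):
        reply = llm(messages)                             # e.g. a LiteLLM completion call
        messages.append({"role": "assistant", "content": reply["content"]})
        tool_name = reply.get("tool")
        if tool_name is None or tool_name == "complete":  # plain answer, or the Completion tool
            return messages
        result = tools[tool_name](**reply.get("args", {}))
        messages.append({"role": "tool", "content": result})
    return messages
```

In the real system each turn also streams tokens to the frontend and persists the thread, but the control flow is essentially this.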
## Architecture
```mermaid
flowchart TD
    FE["Frontend\nNext.js 16 · React 19 · Clerk"]
    BE["Backend API\nFastAPI · Python 3.11"]
    AGENT["Agent Loop"]
    INNGEST["Inngest\nDurable Workflows · Agent Runs"]
    REDIS["Redis\nPub/Sub · Streaming · Locks"]
    DB["Supabase\nPostgreSQL"]
    SANDBOX["Daytona Sandboxes\nCode Execution · Live Previews"]
    LLM["LLM Providers via LiteLLM\nOpenRouter · OpenAI · Anthropic · Google"]

    FE -- "REST + JWT" --> BE
    FE -- "SSE (EventSource)" --> BE
    BE --> INNGEST
    INNGEST --> AGENT
    AGENT --> REDIS
    BE --> DB
    BE --> SANDBOX
    AGENT --> LLM
```
| Component | Technology | Role |
|-----------|-----------|------|
| Backend API | FastAPI, Python 3.11 | REST endpoints, agent orchestration, LLM calls |
| Durable Workflows | Inngest | Agent execution, deployments, webhooks, retryable workflows |
| Frontend | Next.js 16, React 19 | Chat UI, auth (Clerk), real-time streaming |
| Cache / PubSub | Redis | Response streaming, distributed locks, caching |
| Database | Supabase (PostgreSQL) | Persistent storage with Row Level Security |
| Sandboxes | Daytona SDK | Isolated code execution with live web & mobile previews |
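The frontend-to-backend streaming edge in the diagram uses Server-Sent Events. In the browser this is handled by `EventSource`; as a self-contained illustration of the wire format itself, here is a minimal SSE parser (the event payloads are made-up examples, not the project's actual event schema):

```python
# Minimal parser for the text/event-stream wire format used for streaming.
# Illustrative only -- the real client is the browser's EventSource.

def parse_sse(raw: str):
    """Yield the `data:` payload of each event. Events end at a blank line;
    multi-line data fields are joined with newlines."""
    data_lines = []
    for line in raw.splitlines() + [""]:        # trailing "" flushes the final event
        if line.startswith("data:"):
            data_lines.append(line[5:].lstrip())
        elif line == "" and data_lines:
            yield "\n".join(data_lines)
            data_lines = []

# Hypothetical example stream: one token chunk, then a done marker.
stream = 'data: {"type": "token", "text": "Hello"}\n\ndata: [DONE]\n\n'
events = list(parse_sse(stream))
```

The backend publishes agent output through Redis pub/sub and relays it to the browser over this format.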
## Agent Tools
The agent has 13 tools it can use autonomously during a conversation:
| Tool | What it does |
|------|-------------|
| Shell | Execute commands in the sandbox (install deps, run scripts, start servers) |
| Files | Read, write, delete, copy, and move files and directories |
| Grep | Full-text search and semantic (embedding-based) search across files |
| Screenshot | Capture browser screenshots of the running app |
| Vision | AI-powered analysis of screenshots for visual debugging |
| LSP | Find definitions, references, and hover info via Language Server Protocol |
| Web Search | Search the web via Tavily for docs, libraries, and best practices |
| Components | Embedding-based component discovery for code reuse |
| MCP Wrapper | Dynamic integration with GitHub, Slack, Gmail, Notion via Composio |
| Completion | Signal task completion and gracefully stop the agent loop |
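Tools are registered by name so the agent can dispatch LLM tool calls to them (`backend/agentpress/tool_registry.py`). The shape of such a registry can be sketched as follows; the class, decorator, and example tool here are hypothetical, not the project's real interface:

```python
# Hypothetical sketch of a tool registry: map tool names to callables so
# the agent loop can dispatch a tool call by name.

class ToolRegistry:
    def __init__(self):
        self._tools = {}

    def register(self, name, description):
        """Decorator that records a function as an agent tool."""
        def wrap(fn):
            self._tools[name] = {"fn": fn, "description": description}
            return fn
        return wrap

    def call(self, name, **kwargs):
        if name not in self._tools:
            raise KeyError(f"unknown tool: {name}")
        return self._tools[name]["fn"](**kwargs)

registry = ToolRegistry()

@registry.register("shell", "Execute a command in the sandbox")
def shell(command: str) -> str:
    # Placeholder: the real tool would run `command` inside a Daytona sandbox.
    return f"$ {command}"
```

The registered descriptions double as the tool schema shown to the LLM, which is what lets the model pick tools autonomously.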
## Getting Started

### Prerequisites
| Requirement | Minimum | Notes |
|-------------|---------|-------|
| Docker | 24.0+ | With Docker Compose 2.0+ |
| RAM | 4 GB | 8 GB recommended |
| Disk | 2 GB | For Docker images |
| OS | Linux, macOS, Windows (WSL2) | |
For local development without Docker, you also need:
- Node.js 20+
- Python 3.11+
- uv (Python package manager)
### Required Accounts
You will need API keys from these services:
| Service | What for | Get it at |
|---------|----------|-----------|
| Supabase | Database (PostgreSQL) | supabase.com |
| OpenRouter (or OpenAI / Anthropic) | LLM provider (at least one) | openrouter.ai |
| Daytona | Sandboxed code execution | daytona.io |
| Relace | Fast inline code edits | relace.ai |
<details>
<summary><strong>Optional integrations</strong></summary>

| Service | What for |
|---------|----------|
| Clerk | Authentication (user sign-in/sign-up) |
| Tavily | Web search for the agent |
| Vercel | One-click deployment of user projects |
| Composio | Third-party app integrations (GitHub, Slack, Gmail, Notion) |
| Polar.sh | Billing and subscription management |
| Firecrawl | Web scraping |
| Langfuse | LLM observability and tracing |
| Sentry | Error monitoring |

</details>

## Quick Start (Docker)
**1. Clone and set up environment**

```bash
git clone https://github.com/cheatcode-ai/cheatcode.git
cd cheatcode

# Copy environment templates
cp backend/.env.example backend/.env
cp frontend/.env.example frontend/.env
```
You can also run `./scripts/setup.sh` to check prerequisites and copy the env files automatically.
**2. Fill in your API keys**

Edit `backend/.env` and `frontend/.env` with your API keys. At minimum, you need the values listed in Required Accounts.

See `backend/.env.example` for the full list of optional variables, including:
- Clerk (authentication)
- Tavily (web search)
- Firecrawl (web scraping)
- Vercel (deployments)
- Composio (third-party integrations)
- Langfuse (observability)
- Sentry (error tracking)
- Polar (billing)
- Inngest (durable workflows)
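A quick way to catch a missing key before the stack boots is to check the environment up front. This is an optional sketch; the variable names below are examples only, and `backend/.env.example` remains the authoritative list:

```python
# Sanity-check that required environment variables are set before starting.
# The names in REQUIRED_KEYS are illustrative examples, not the exact list.
import os

REQUIRED_KEYS = ["SUPABASE_URL", "OPENROUTER_API_KEY", "DAYTONA_API_KEY"]

def missing_env(required, env=None):
    """Return the names in `required` that are unset or empty."""
    env = os.environ if env is None else env
    return [name for name in required if not env.get(name)]

if __name__ == "__main__":
    gaps = missing_env(REQUIRED_KEYS)
    if gaps:
        print("Missing keys:", ", ".join(gaps))
```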
**3. Start everything**

```bash
docker compose -f docker-compose.dev.yml up --build
```
**4. Open the app**

| Service | URL |
|---------|-----|
| Frontend | http://localhost:3000 |
| Backend API | http://localhost:8000 |
| Health check | http://localhost:8000/api/health |
| Inngest dashboard | http://localhost:8288 |
Sign in with Clerk, create a project, start a thread, and send your first message.
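The first build can take a while, so it may help to wait for the health check endpoint before opening the UI. A sketch of that wait, with the HTTP call injected as a parameter so the retry logic is self-contained (`wait_for_healthy` is an illustrative helper, not part of the project; against a live stack you would pass e.g. a small `urllib.request` wrapper as `fetch`):

```python
# Poll a health endpoint until it returns HTTP 200 or attempts run out.
import time

def wait_for_healthy(url, fetch, attempts=10, delay=0.5):
    """`fetch(url)` should return an HTTP status code, or raise OSError
    while the server is still coming up."""
    for _ in range(attempts):
        try:
            if fetch(url) == 200:
                return True
        except OSError:
            pass
        time.sleep(delay)
    return False
```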
## Local Development (Without Docker)
If you prefer running services directly on your machine:
### Backend

```bash
cd backend

# Install dependencies
uv sync

# Copy env file if you haven't already
cp .env.example .env
# Edit .env with your API keys

# Start the API server (with hot reload)
uv run uvicorn main:app --reload --host 0.0.0.0 --port 8000
```
Note: You'll need a Redis instance running locally or via Upstash. Update `REDIS_URL` in `backend/.env` accordingly. For local Redis: `redis://localhost:6379`.
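If you're unsure a `REDIS_URL` value is well-formed, the standard library can parse it, which is a cheap check before pointing the backend at it:

```python
# Parse a Redis connection URL to confirm scheme, host, and port.
from urllib.parse import urlparse

url = urlparse("redis://localhost:6379")
assert url.scheme == "redis"
print(url.hostname, url.port)   # localhost 6379
```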
### Frontend

```bash
cd frontend

# Install dependencies
npm install

# Copy env file if you haven't already
cp .env.example .env
# Edit .env with your keys

# Start the dev server (Turbopack)
npm run dev
```
The frontend runs at http://localhost:3000.
## Docker Compose (Development Mode)
For hot-reload on both backend and frontend with Docker:
```bash
docker compose -f docker-compose.dev.yml up
```
This starts:
- Backend on port 8000 -- uvicorn with `--reload`
- Frontend on port 3000 -- Next.js dev with Turbopack
- Inngest dev server on port 8288
- Redis on port 6380
## Project Structure
```
cheatcode/
├── backend/                        # Python FastAPI backend
│   ├── main.py                     # App entry point
│   ├── agent/                      # Agent runtime
│   │   ├── run.py                  # Agent execution loop
│   │   ├── api.py                  # Agent REST endpoints
│   │   ├── schemas.py              # Pydantic models
│   │   ├── coding_agent_prompt.py  # System prompt (web/app)
│   │   ├── mobile_agent_prompt.py  # System prompt (mobile)
│   │   └── tools/                  # 13 agent tools
│   ├── agentpress/                 # Agent framework
│   │   ├── thread_manager.py       # Conversation management
│   │   ├── response_processor.py   # LLM response parsing + tool exec
│   │   ├── tool_registry.py        # Tool registration
│   │   └── context_manager.py      # Token limit management
│   ├── services/                   # Service integrations
│   │   ├── llm.py                  # LiteLLM (multi-provider LLM)
│   │   ├── redis.py                # Redis client + pub/sub
│   │   ├── supabase.py             # Database client
│   │   ├── billing.py              # Billing + plans
│   │   ├── vercel_deploy.py        # Vercel deployments
│   │   └── ...                     # 15+ service modules
│   ├── inngest_functions/          # Durable workflow definitions
│   ├── composio_integration/       # MCP integrations + OAuth
│   ├── sandbox/                    # Daytona sandbox API
│   ├── deployments/                # Vercel deployment API
│   ├── utils/
```