Archived: Project has Moved

This repository is no longer maintained and has been archived for provenance.

The project has continued under the name Crush, developed by the original author and the Charm team.

Please follow Crush for ongoing development.

⌬ OpenCode

<p align="center"><img src="https://github.com/user-attachments/assets/9ae61ef6-70e5-4876-bc45-5bcb4e52c714" width="800"></p>

⚠️ Early Development Notice: This project is in early development and is not yet ready for production use. Features may change, break, or be incomplete. Use at your own risk.

A powerful terminal-based AI assistant for developers, providing intelligent coding assistance directly in your terminal.

Overview

OpenCode is a Go-based CLI application that brings AI assistance to your terminal. It provides a TUI (Terminal User Interface) for interacting with various AI models to help with coding tasks, debugging, and more.

<p>For a quick video overview, check out <a href="https://www.youtube.com/watch?v=P8luPmEa1QI"><img width="25" src="https://upload.wikimedia.org/wikipedia/commons/0/09/YouTube_full-color_icon_%282017%29.svg"> OpenCode + Gemini 2.5 Pro: BYE Claude Code! I'm SWITCHING To the FASTEST AI Coder!</a></p>

<a href="https://www.youtube.com/watch?v=P8luPmEa1QI"><img width="550" src="https://i3.ytimg.com/vi/P8luPmEa1QI/maxresdefault.jpg"></a><p>

Features

  • Interactive TUI: Built with Bubble Tea for a smooth terminal experience
  • Multiple AI Providers: Support for OpenAI, Anthropic Claude, Google Gemini, AWS Bedrock, Groq, Azure OpenAI, and OpenRouter
  • Session Management: Save and manage multiple conversation sessions
  • Tool Integration: AI can execute commands, search files, and modify code
  • Vim-like Editor: Integrated editor with text input capabilities
  • Persistent Storage: SQLite database for storing conversations and sessions
  • LSP Integration: Language Server Protocol support for code intelligence
  • File Change Tracking: Track and visualize file changes during sessions
  • External Editor Support: Open your preferred editor for composing messages
  • Named Arguments for Custom Commands: Create powerful custom commands with multiple named placeholders

Installation

Using the Install Script

# Install the latest version
curl -fsSL https://raw.githubusercontent.com/opencode-ai/opencode/refs/heads/main/install | bash

# Install a specific version
curl -fsSL https://raw.githubusercontent.com/opencode-ai/opencode/refs/heads/main/install | VERSION=0.1.0 bash

Using Homebrew (macOS and Linux)

brew install opencode-ai/tap/opencode

Using AUR (Arch Linux)

# Using yay
yay -S opencode-ai-bin

# Using paru
paru -S opencode-ai-bin

Using Go

go install github.com/opencode-ai/opencode@latest

Configuration

OpenCode looks for configuration in the following locations:

  • $HOME/.opencode.json
  • $XDG_CONFIG_HOME/opencode/.opencode.json
  • ./.opencode.json (local directory)
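The lookup order above can be sketched as a shell loop. This is an illustrative approximation, not the actual Go implementation, and it assumes `$XDG_CONFIG_HOME` is set when the second path is used:

```shell
# Illustrative sketch of the config search order (not the real implementation):
for f in "$HOME/.opencode.json" \
         "$XDG_CONFIG_HOME/opencode/.opencode.json" \
         "./.opencode.json"; do
  [ -f "$f" ] && echo "config candidate: $f"
done
```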

Auto Compact Feature

OpenCode includes an auto compact feature that automatically summarizes your conversation when it approaches the model's context window limit. Enabled by default, this feature:

  • Monitors token usage during your conversation
  • Automatically triggers summarization when usage reaches 95% of the model's context window
  • Creates a new session with the summary, allowing you to continue your work without losing context
  • Helps prevent "out of context" errors that can occur with long conversations

You can enable or disable this feature in your configuration file:

{
  "autoCompact": true // default is true
}
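For a concrete sense of the 95% threshold, assume a hypothetical 200,000-token context window (the actual window depends on the model in use):

```shell
# Hypothetical 200k-token context window; compaction triggers at 95% usage.
CONTEXT_WINDOW=200000
THRESHOLD=$((CONTEXT_WINDOW * 95 / 100))
echo "$THRESHOLD tokens"   # summarization would kick in at 190000 tokens
```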

Environment Variables

You can configure OpenCode using environment variables:

| Environment Variable | Purpose |
| ------------------------ | ------------------------------------------------------ |
| ANTHROPIC_API_KEY | For Claude models |
| OPENAI_API_KEY | For OpenAI models |
| GEMINI_API_KEY | For Google Gemini models |
| GITHUB_TOKEN | For Github Copilot models (see Using Github Copilot) |
| VERTEXAI_PROJECT | For Google Cloud VertexAI (Gemini) |
| VERTEXAI_LOCATION | For Google Cloud VertexAI (Gemini) |
| GROQ_API_KEY | For Groq models |
| AWS_ACCESS_KEY_ID | For AWS Bedrock (Claude) |
| AWS_SECRET_ACCESS_KEY | For AWS Bedrock (Claude) |
| AWS_REGION | For AWS Bedrock (Claude) |
| AZURE_OPENAI_ENDPOINT | For Azure OpenAI models |
| AZURE_OPENAI_API_KEY | For Azure OpenAI models (optional when using Entra ID) |
| AZURE_OPENAI_API_VERSION | For Azure OpenAI models |
| LOCAL_ENDPOINT | For self-hosted models |
| SHELL | Default shell to use (if not specified in config) |

Shell Configuration

OpenCode allows you to configure the shell used by the bash tool. By default, it uses the shell specified in the SHELL environment variable, or falls back to /bin/bash if not set.

You can override this in your configuration file:

{
  "shell": {
    "path": "/bin/zsh",
    "args": ["-l"]
  }
}

This is useful if you want to use a different shell than your default system shell, or if you need to pass specific arguments to the shell.
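Conceptually, the `path` and `args` values compose into the command line OpenCode launches for the bash tool. The sketch below is illustrative only (the variable names and the `echo` payload are made up, not OpenCode internals):

```shell
# How "path" and "args" from the shell config compose into an invocation
# (illustrative sketch, not the actual implementation):
SHELL_PATH="/bin/bash"        # config "path"
SHELL_ARGS="-l"               # config "args"
"$SHELL_PATH" $SHELL_ARGS -c 'echo "shell ready: $BASH_VERSION"'
```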

Configuration File Structure

{
  "data": {
    "directory": ".opencode"
  },
  "providers": {
    "openai": {
      "apiKey": "your-api-key",
      "disabled": false
    },
    "anthropic": {
      "apiKey": "your-api-key",
      "disabled": false
    },
    "copilot": {
      "disabled": false
    },
    "groq": {
      "apiKey": "your-api-key",
      "disabled": false
    },
    "openrouter": {
      "apiKey": "your-api-key",
      "disabled": false
    }
  },
  "agents": {
    "coder": {
      "model": "claude-3.7-sonnet",
      "maxTokens": 5000
    },
    "task": {
      "model": "claude-3.7-sonnet",
      "maxTokens": 5000
    },
    "title": {
      "model": "claude-3.7-sonnet",
      "maxTokens": 80
    }
  },
  "shell": {
    "path": "/bin/bash",
    "args": ["-l"]
  },
  "mcpServers": {
    "example": {
      "type": "stdio",
      "command": "path/to/mcp-server",
      "env": [],
      "args": []
    }
  },
  "lsp": {
    "go": {
      "disabled": false,
      "command": "gopls"
    }
  },
  "debug": false,
  "debugLSP": false,
  "autoCompact": true
}
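Since the configuration file is plain JSON, a quick syntax check before launching can catch typos. `python3 -m json.tool` is used here purely as a convenient validator; it is not part of OpenCode:

```shell
# Sanity-check the local config's JSON syntax before starting OpenCode:
python3 -m json.tool .opencode.json > /dev/null && echo "config OK"
```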

Supported AI Models

OpenCode supports a variety of AI models from different providers:

OpenAI

  • GPT-4.1 family (gpt-4.1, gpt-4.1-mini, gpt-4.1-nano)
  • GPT-4.5 Preview
  • GPT-4o family (gpt-4o, gpt-4o-mini)
  • O1 family (o1, o1-pro, o1-mini)
  • O3 family (o3, o3-mini)
  • O4 Mini

Anthropic

  • Claude 4 Sonnet
  • Claude 4 Opus
  • Claude 3.5 Sonnet
  • Claude 3.5 Haiku
  • Claude 3.7 Sonnet
  • Claude 3 Haiku
  • Claude 3 Opus

GitHub Copilot

  • GPT-3.5 Turbo
  • GPT-4
  • GPT-4o
  • GPT-4o Mini
  • GPT-4.1
  • Claude 3.5 Sonnet
  • Claude 3.7 Sonnet
  • Claude 3.7 Sonnet Thinking
  • Claude Sonnet 4
  • O1
  • O3 Mini
  • O4 Mini
  • Gemini 2.0 Flash
  • Gemini 2.5 Pro

Google

  • Gemini 2.5
  • Gemini 2.5 Flash
  • Gemini 2.0 Flash
  • Gemini 2.0 Flash Lite

AWS Bedrock

  • Claude 3.7 Sonnet

Groq

  • Llama 4 Maverick (17b-128e-instruct)
  • Llama 4 Scout (17b-16e-instruct)
  • QWEN QWQ-32b
  • Deepseek R1 distill Llama 70b
  • Llama 3.3 70b Versatile

Azure OpenAI

  • GPT-4.1 family (gpt-4.1, gpt-4.1-mini, gpt-4.1-nano)
  • GPT-4.5 Preview
  • GPT-4o family (gpt-4o, gpt-4o-mini)
  • O1 family (o1, o1-mini)
  • O3 family (o3, o3-mini)
  • O4 Mini

Google Cloud VertexAI

  • Gemini 2.5
  • Gemini 2.5 Flash

Usage

# Start OpenCode
opencode

# Start with debug logging
opencode -d

# Start with a specific working directory
opencode -c /path/to/project

Non-interactive Prompt Mode

You can run OpenCode in non-interactive mode by passing a prompt directly as a command-line argument. This is useful for scripting, automation, or when you want a quick answer without launching the full TUI.

# Run a single prompt and print the AI's response to the terminal
opencode -p "Explain the use of context in Go"

# Get response in JSON format
opencode -p "Explain the use of context in Go" -f json

# Run without showing the spinner (useful for scripts)
opencode -p "Explain the use of context in Go" -q

In this mode, OpenCode will process your prompt, print the result to standard output, and then exit. All permissions are auto-approved for the session.

By default, a spinner animation is displayed while the model is processing your query. You can disable this spinner with the -q or --quiet flag, which is particularly useful when running OpenCode from scripts or automated workflows.

Output Formats

OpenCode supports the following output formats in non-interactive mode:

| Format | Description |
| ------ | ------------------------------- |
| text | Plain text output (default) |
| json | Output wrapped in a JSON object |

