Roundtable AI MCP Server
Stop copy-pasting between AI models. Roundtable AI is a local MCP server that lets your primary AI assistant delegate tasks to specialized models like Gemini, Claude, Codex, and Cursor. Solve complex engineering problems in parallel, directly from your IDE.
Key Features:
- Context Continuity: Shared project context across all sub-agents
- Parallel Execution: All agents work simultaneously
- Model Specialization: Right AI for each task (Gemini's 1M context, Claude's reasoning, Codex's implementation)
- Zero Markup: Uses your existing CLI tools and API subscriptions
- 26+ IDE Support: Works with Claude Code, Cursor, VS Code, JetBrains, and more
Table of Contents
- Quick Start
- What is Roundtable AI
- Technical Architecture
- Why Multi-Agent vs Single AI
- Real-World Examples
- Installation
- IDE Integration
- Advanced Configuration
- Contributing
- License
Quick Start
```bash
# Install Roundtable AI
pip install roundtable-ai

# Check available AI tools
roundtable-ai --check

# Start with all available tools
roundtable-ai

# Use specific assistants only
roundtable-ai --agents codex,claude
```
One-liner for Claude Code:
```bash
claude mcp add roundtable-ai -- roundtable-ai --agents gemini,claude,codex,cursor
```
Try this multi-agent prompt in your IDE:
```
The user dashboard is randomly slow for enterprise customers.

Use Gemini SubAgent to analyze frontend performance issues in the React components, especially expensive re-renders and inefficient data fetching.
Use Codex SubAgent to examine the backend API endpoint for N+1 queries and database bottlenecks.
Use Claude SubAgent to review the infrastructure logs and identify memory/CPU pressure during peak hours.
```
What is Roundtable AI
Roundtable AI is a local Model Context Protocol (MCP) server that coordinates specialized AI sub-agents to solve complex engineering problems. Instead of manually switching between different AI tools, you delegate tasks from a single prompt in your IDE, and Roundtable manages the coordination, context sharing, and response synthesis.
Key Benefits
- Context Continuity: The primary agent provides shared, rich context to all sub-agents
- Parallel Execution: All agents work simultaneously, drastically reducing wait time
- Model Specialization: Use the right AI for each task - Gemini's 1M context for analysis, Claude's reasoning for logic, Codex for implementation
- No Extra Cost: Uses your existing CLI tools and API subscriptions with zero markup
- Single Interface: One prompt, multiple specialized responses, automatically synthesized
Technical Architecture
```
+----------------------------------+
| Your IDE (VS Code, Cursor, etc.) |
|      (Primary AI Assistant)      |
+----------------+-----------------+
                 |
   (1. User prompt with sub-agent delegation)
                 |
+----------------v-----------------+
|      Roundtable MCP Server       |
|           (localhost)            |
+----------------+-----------------+
                 |
   (2. Dispatches tasks to sub-agent CLIs in parallel)
                 |
+----------------v-----------------------------+
|                                              |
|  +-----------+ +-----------+ +-------------+ |
|  |  Gemini   | |  Claude   | |    Codex    | |
|  | (Analysis)| |  (Logic)  | | (Implement) | |
|  +-----------+ +-----------+ +-------------+ |
|                                              |
+----------------^-----------------------------+
                 |
   (3. Sub-agents execute using local tools,
       e.g., read_file, run_shell_command)
                 |
+----------------+-----------------+
|      Roundtable MCP Server       |
|    (Aggregates & Synthesizes)    |
+----------------+-----------------+
                 |
   (4. Returns a single, synthesized response)
                 |
+----------------v-----------------+
| Your IDE (Primary AI Assistant)  |
+----------------------------------+
```
How It Works
- Context Continuity: The initial prompt and relevant file/project context are packaged by the primary agent. The MCP server passes this "context bundle" to each sub-agent, ensuring all participants have the same ground truth without manual copy-pasting.
- Model Specialization: Use the right model for the job. Leverage Gemini's 1M context for codebase analysis, Claude's reasoning for logic and implementation, and Codex's proficiency for code generation and reviews, all in one workflow.
- No Extra Cost: Roundtable invokes the CLI tools you already have installed and configured. It uses your existing API keys and subscriptions. We add no markup. The cost is exactly what you would pay running the tools manually.
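The dispatch step above can be sketched as a small asyncio fan-out: the same context bundle is written to each sub-agent CLI's stdin, and all processes run concurrently. This is an illustrative sketch, not Roundtable's actual internals; the `run_subagent`/`dispatch` names and the agent command lines are assumptions.

```python
import asyncio

async def run_subagent(cmd: list[str], context_bundle: str) -> str:
    """Launch one sub-agent CLI and feed it the shared context bundle on stdin."""
    proc = await asyncio.create_subprocess_exec(
        *cmd,
        stdin=asyncio.subprocess.PIPE,
        stdout=asyncio.subprocess.PIPE,
    )
    out, _ = await proc.communicate(context_bundle.encode())
    return out.decode().strip()

async def dispatch(context_bundle: str, agents: dict[str, list[str]]) -> dict[str, str]:
    """Fan the same context out to every sub-agent and gather results in parallel."""
    results = await asyncio.gather(
        *(run_subagent(cmd, context_bundle) for cmd in agents.values())
    )
    return dict(zip(agents, results))
```

Because `asyncio.gather` awaits all subprocesses concurrently, total wall time is roughly that of the slowest sub-agent rather than the sum of all of them.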
Why Multi-Agent vs Single AI
Because manual context-switching is slow, error-prone, and prevents deep analysis.
The Multi-Tab Workflow ❌
- Manually copy-paste code and context between different AI chats
- Each agent starts fresh, unaware of other conversations or files
- You wait for one agent to finish before starting the next
- You are responsible for merging disparate, often conflicting, advice
- High risk of pasting outdated code or incorrect context
The Roundtable Workflow ✅
- Delegate tasks from a single prompt in your IDE
- The primary agent provides shared, rich context to all sub-agents
- All agents work in parallel, drastically reducing wait time
- The final output automatically synthesizes the best insights from each model
- The entire workflow is a single, deterministic, and repeatable command
Real-World Examples
Each example includes real code, logs, and explicit delegation to specialized sub-agents. Copy the whole block and paste it into your IDE assistant.
1) Multi-Stack Debugging — Virtual War Room for Production Issues
I'm debugging a critical production issue. The user sees a "Failed to load data" message.
Here is the browser console output:
```json
{
  "timestamp": "2024-09-24T10:05:21.123Z",
  "level": "error",
  "message": "API request failed for /api/v1/user/profile",
  "error": {
    "status": 500,
    "statusText": "Internal Server Error"
  }
}
```
Here is the backend server log:
```
ERROR: Exception in ASGI application
  File "/app/services/user_service.py", line 42, in get_user_profile
    user_data = await db.fetch_one(query)
ValueError: Database connection is not available
```
Use Gemini SubAgent to analyze the logs from both stacks, correlate the events, and form a hypothesis about the root cause. Use Codex SubAgent to analyze the Python backend traceback and suggest a specific code fix for the database connection error. Use Claude SubAgent to review the frontend error handling and recommend more resilient patterns. Use Cursor SubAgent to search the codebase for other files that might have similar database connection issues.
At the end, aggregate all findings into a single incident report with root cause analysis and prioritized fixes.
2) Performance Optimization — API Latency & Database Query Tuning
Our checkout API p95 latency increased from 220ms to 780ms. Need optimization strategy.

PostgreSQL slow query log:
```sql
-- Duration: 2455.112 ms
SELECT c.name, COUNT(o.id) AS total_orders, SUM(p.amount) AS revenue
FROM companies c, orders o, payments p
WHERE c.id = o.company_id
  AND o.id = p.order_id
  AND c.region = 'North America'
GROUP BY c.name
ORDER BY revenue DESC;
```
EXPLAIN ANALYZE shows:
```
Seq Scan on orders  (cost=0.00..52000.00 rows=100000)
  Filter: (status = 'completed')
  Rows Removed by Filter: 134201
```
Node.js hotspot from profiling:
```javascript
// 40% CPU time
orders.map(o => ({ ...o, json: JSON.stringify(o) }));

// N+1 query problem
for (const id of orderIds) {
  await fetchInventory(id);
}
```
Use Claude SubAgent to analyze the EXPLAIN plan and identify why the query is slow. Use Codex SubAgent to rewrite the SQL with proper JOINs and suggest indexes. Use Gemini SubAgent to fix the N+1 query problem with batch fetching. Use Cursor SubAgent to find all instances of JSON.stringify in hot code paths.
Aggregate findings into a performance optimization plan with measurable improvements.
Installation
Using pip (Standard)
```bash
pip install roundtable-ai
```
Using UV/UVX (Recommended for faster installs)
```bash
uvx roundtable-ai@latest
```
IDE Integration
Roundtable AI supports 26+ MCP-compatible clients. Here are the top 7:
1. Claude Code
Using pip:
```bash
claude mcp add roundtable-ai -- roundtable-ai --agents gemini,claude,codex,cursor
```
Using UVX:
```bash
claude mcp add roundtable-ai -- uvx roundtable-ai@latest --agents gemini,claude,codex,cursor
```
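For MCP clients configured through a JSON file rather than a CLI command (Cursor and Claude Desktop use this pattern), a stdio server entry would look roughly like the sketch below. The file location and the `--agents` list are assumptions to adapt to your setup:

```json
{
  "mcpServers": {
    "roundtable-ai": {
      "command": "uvx",
      "args": ["roundtable-ai@latest", "--agents", "gemini,claude,codex,cursor"]
    }
  }
}
```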
2. Cursor
One-Click Install:
[](cursor://anysphere.cursor-deeplink/mcp/install?name=roundtable-ai&config=eyJ0eXBlIjoic3RkaW8iLCJjb21tYW5kIjoidXZ4IiwiYXJncyI6WyJy