# KiwiQ AI Platform
Production-grade multi-agent orchestration platform — JSON-defined agents, multi-tier memory, and built-in observability.
Battle-tested on 200+ enterprise AI agents. Define complex multi-step AI workflows as Python/JSON graph schemas with 24+ reusable node types, multi-provider LLM support, human-in-the-loop interactions, web scraping, RAG pipelines, and versioned customer data management. Ships with 27+ ready-to-use workflow definitions.
This platform powered KiwiQ AI's marketing AI agents in production at https://kiwiq.ai and is now fully open-sourced.
## Features
- SDK-first workflow engine — Define workflows as Python/JSON graph schemas (not a visual builder), compiled to LangGraph and executed via Prefect
- Multi-provider LLM support — OpenAI, Anthropic, Google Gemini, Perplexity, Fireworks, AWS Bedrock
- Human-in-the-Loop (HITL) — Pause workflows for human review, input, or approval with real-time WebSocket streaming
- 24+ reusable node types — LLM, routing, conditional branching, data transforms, scraping, code execution, sub-workflows, and more
- 27+ production workflow definitions — Content creation, diagnostics, lead scoring, deep research, playbook generation, and more included out of the box
- Multi-tier memory & state — PostgreSQL for relational state, MongoDB for versioned documents, Weaviate for vector search, Redis for caching
- RAG pipelines — Document ingestion, vector search (Weaviate), and retrieval-augmented generation
- Customer data management — Versioned document storage in MongoDB with CRUD workflow nodes
- Event-driven architecture — RabbitMQ-based event bus for async processing between services
- Observability — Real-time progress via RabbitMQ events, WebSocket streaming, Prefect dashboard, structured logging
- Web scraping — LinkedIn profiles/companies, web crawling, AI-powered search engines
- Sandboxed code execution — Run user-defined Python safely within workflows
- Billing & auth — Stripe integration, JWT authentication, role-based access control
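As a rough illustration of the SDK-first approach, a graph-schema definition might look like the sketch below. The node type names come from the node table later in this README, but the schema shape itself (`nodes`/`edges` keys, `config` fields) is invented for illustration and is not the platform's actual schema:

```python
# Hypothetical sketch of a Python/JSON graph schema. Node *types* come from
# the README's node table; the dict layout itself is illustrative only.
workflow = {
    "name": "lead_scoring_demo",
    "nodes": [
        {"id": "in",    "type": "input_node"},
        {"id": "score", "type": "llm",
         "config": {"provider": "openai",
                    "prompt": "Score this lead from 1-10: {lead}"}},
        {"id": "out",   "type": "output_node"},
    ],
    "edges": [
        {"from": "in", "to": "score"},
        {"from": "score", "to": "out"},
    ],
}

def validate(schema: dict) -> list[str]:
    """Return edge endpoints that reference undefined node ids."""
    ids = {n["id"] for n in schema["nodes"]}
    return [e[k] for e in schema["edges"] for k in ("from", "to")
            if e[k] not in ids]

print(validate(workflow))  # []
```

A definition in this style would be compiled to a LangGraph graph and executed via Prefect, per the engine description above.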
## Architecture Overview
```
┌─────────────────────────────────────────────────────────────┐
│                     FastAPI (kiwi_app)                      │
│ Auth │ Billing │ Workflow API │ RAG │ Data Jobs │ WebSocket │
└──────────────────────────┬──────────────────────────────────┘
                           │
              ┌────────────┼────────────┐
              ▼            ▼            ▼
         ┌──────────┐ ┌──────────┐ ┌──────────┐
         │ RabbitMQ │ │ Prefect  │ │  Redis   │
         │ (events) │ │  (orch)  │ │ (cache)  │
         └────┬─────┘ └────┬─────┘ └──────────┘
              │            │
              ▼            ▼
        ┌──────────────────────────┐
        │     Workflow Service     │
        │ ┌───────────┐ ┌────────┐ │
        │ │ LangGraph │ │ Nodes  │ │
        │ │  Engine   │ │Registry│ │
        │ └───────────┘ └────────┘ │
        └────────────┬─────────────┘
                     │
     ┌───────────┬───┼───────┬───────────┐
     ▼           ▼           ▼           ▼
┌──────────┐┌──────────┐┌──────────┐┌──────────┐
│ Postgres ││ MongoDB  ││ Weaviate ││ LLM APIs │
│ (state)  ││  (docs)  ││ (vector) ││ (OpenAI…)│
└──────────┘└──────────┘└──────────┘└──────────┘
```
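The RabbitMQ edge in the diagram carries workflow-progress events between the API layer and the workflow service. As a minimal sketch of what such an event payload could look like (the field names here are assumptions, not the platform's actual event contract):

```python
import json
import time
import uuid

def build_progress_event(workflow_id: str, node_id: str, status: str) -> bytes:
    """Serialize a workflow-progress event for publishing on an event bus.
    Field names are illustrative; the platform defines its own contract."""
    event = {
        "event_id": str(uuid.uuid4()),
        "workflow_id": workflow_id,
        "node_id": node_id,
        "status": status,          # e.g. "running", "completed", "failed"
        "timestamp": time.time(),
    }
    return json.dumps(event).encode("utf-8")

payload = build_progress_event("wf-123", "llm-1", "completed")
decoded = json.loads(payload)
```

In the real system, consumers of such events drive the WebSocket streaming and dashboard updates described under Observability.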
## Quick Start
### Prerequisites

- Python and Poetry
- Docker & Docker Compose
- Git
### 1. Clone and install

```bash
git clone https://github.com/kiwiq-ai/kiwiq-oss.git
cd kiwiq-oss
poetry install
```
### 2. Configure environment

```bash
cp .env.sample .env
```

Edit `.env` and fill in the required values:
| Variable | Description |
|----------|-------------|
| `OPENAI_API_KEY` | OpenAI API key (required for LLM nodes) |
| `ANTHROPIC_API_KEY` | Anthropic API key (optional) |
| `GOOGLE_API_KEY` | Google Gemini API key (optional) |
| `PPLX_API_KEY` | Perplexity API key (optional, for deep research workflows) |
| `POSTGRES_*` | PostgreSQL credentials |
| `MONGO_ROOT_*` | MongoDB credentials |
| `RABBITMQ_DEFAULT_*` | RabbitMQ credentials |
| `REDIS_PASSWORD` | Redis password |
| `SECRET_KEY` | JWT secret — generate with `openssl rand -hex 32` |

See `.env.sample` for the full list of configuration options.
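A small pre-flight check can catch missing configuration before the services start. This is a sketch, not part of the platform; the required/optional split simply mirrors the table above:

```python
# Pre-flight sketch: verify required environment variables are set.
# Key lists mirror the configuration table in this README.
REQUIRED = ["OPENAI_API_KEY", "SECRET_KEY", "REDIS_PASSWORD"]
OPTIONAL = ["ANTHROPIC_API_KEY", "GOOGLE_API_KEY", "PPLX_API_KEY"]

def missing_required(env: dict[str, str]) -> list[str]:
    """Return required keys that are absent or empty in the given mapping."""
    return [k for k in REQUIRED if not env.get(k)]

# In practice you would pass os.environ; a literal dict keeps this runnable.
print(missing_required({"OPENAI_API_KEY": "sk-...",
                        "SECRET_KEY": "x",
                        "REDIS_PASSWORD": "y"}))  # []
```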
### 3. Start services

```bash
# Development (all services including databases)
docker compose -f docker-compose-dev.yml up --build
```
### 4. Access the platform
- API docs: http://localhost:8000/docs
- Prefect dashboard: http://localhost:4201
- RabbitMQ management: http://localhost:15672
- RedisInsight: http://localhost:8001
## Using with Claude Code
This repository ships with a comprehensive `CLAUDE.md` that gives Claude Code full context on the project — architecture, commands, testing patterns, workflow development, and troubleshooting. To get started:

- Install Claude Code
- Run `claude` from the repo root

Claude Code will automatically pick up `CLAUDE.md` and can help with local setup, running tests, modifying services, building workflows, debugging, and navigating the codebase.
## Docker Environments
### Development (`docker-compose-dev.yml`)
Full local environment with all services containerized:
| Service | Port | Description |
|---------|------|-------------|
| FastAPI App | 8000 | Core API server with live reload |
| PostgreSQL | 5432 | Relational data, workflow state, LangGraph checkpoints |
| MongoDB | 27017 | Customer data, workflow configs, prompt templates |
| Redis | 6379 | Caching and session management |
| RabbitMQ | 5672 / 15672 | Event streaming / management UI |
| Weaviate | 8080 | Vector database for RAG |
| Prefect Server | 4201 | Workflow orchestration dashboard |
| Prefect Agent | — | Workflow execution worker |
All data is persisted via named Docker volumes. Code changes are hot-reloaded via volume mounts.
### Production (`docker-compose.prod.yml`)
Production-hardened environment with:
- Nginx + Certbot for SSL/TLS termination
- External managed databases (PostgreSQL, MongoDB) — not containerized
- Resource limits on all containers (CPU and memory caps)
- JSON-file logging with rotation
- Health checks on critical services
- Tuned for 4 concurrent workflow executions
## Running Without Docker
To run services directly on your host:
```bash
# Set Python path
export PYTHONPATH=$(pwd):$(pwd)/services

# Start FastAPI server
poetry run uvicorn kiwi_app.main:app --host 0.0.0.0 --port 8000 --reload

# Start Prefect worker (separate terminal)
poetry run python services/workflow_service/services/worker.py
```
You'll need PostgreSQL, MongoDB, Redis, RabbitMQ, and Weaviate running separately (locally or hosted).
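When running without Docker, a quick way to confirm the backing services are reachable is a TCP connect check. This sketch assumes the services listen on localhost at the default ports from the development compose table; adjust hosts and ports for your setup:

```python
import socket

# Default ports from the development Docker table in this README;
# adjust if your services run elsewhere.
SERVICES = {
    "PostgreSQL": 5432,
    "MongoDB": 27017,
    "Redis": 6379,
    "RabbitMQ": 5672,
    "Weaviate": 8080,
}

def is_listening(host: str, port: int, timeout: float = 0.5) -> bool:
    """Return True if a TCP listener accepts a connection on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

for name, port in SERVICES.items():
    status = "up" if is_listening("127.0.0.1", port) else "down"
    print(f"{name:12} :{port}  {status}")
```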
## Repository Structure
```
kiwiq-oss/
├── libs/src/                  # Shared libraries
│   ├── db/                    # PostgreSQL (SQLModel, Alembic migrations)
│   ├── mongo_client/          # MongoDB client
│   ├── redis_client/          # Redis client
│   ├── rabbitmq_client/       # RabbitMQ event streaming
│   ├── weaviate_client/       # Weaviate vector DB client
│   ├── global_config/         # Global settings
│   └── global_utils/          # Shared utilities
│
├── services/
│   ├── kiwi_app/              # Core FastAPI application
│   │   ├── auth/              # Authentication & authorization
│   │   ├── billing/           # Stripe billing & credits
│   │   ├── workflow_app/      # Workflow API, WebSockets, events
│   │   ├── data_jobs/         # Data ingestion & RAG pipelines
│   │   └── rag_service/       # RAG query endpoints
│   │
│   ├── workflow_service/      # Workflow engine
│   │   ├── registry/nodes/    # 24+ node implementations
│   │   ├── graph/             # GraphSchema → LangGraph builder
│   │   └── services/worker.py # Prefect worker entrypoint
│   │
│   ├── linkedin_integration/  # LinkedIn OAuth & API
│   └── scraper_service/       # Web scraping endpoints
│
├── standalone_test_client/    # Workflow SDK, 27+ workflow definitions
│   └── kiwi_client/
│       └── workflows/active/  # Production workflow examples
│
├── tests/                     # 75+ unit & integration tests
├── docs/                      # 40+ pages of technical documentation
├── docker/                    # Dockerfiles & setup scripts
└── pyproject.toml             # Poetry dependencies
```
## Node Types
Workflows are composed from reusable nodes defined in Python/JSON. Key categories:
| Category | Nodes | Description |
|----------|-------|-------------|
| Core | `input_node`, `output_node`, `router_node`, `map_list_router_node`, `if_else_condition`, `transform_data` | Flow control, routing, data transforms |
| LLM | `llm`, `prompt_constructor`, `prompt_template_loader` | Multi-provider LLM execution, prompt building |
| Data | `load_customer_data`, `store_customer_data`, `delete_customer_data`, `merge_aggregate` | MongoDB CRUD, data joins |
| Scraping | `linkedin_scraping`, `crawler_scraper`, `ai_answer_engine_scraper` | LinkedIn, web crawling, AI search |
| Advanced | `tool_executor`, `code_runner`, `workflow_runner`, `hitl_node` | Tool use, sandboxed code, sub-workflows, human review |
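To make the branching semantics concrete, here is a toy dispatcher in the spirit of `if_else_condition`. The condition/config shape is invented for illustration; the platform's real node schema may differ:

```python
# Toy illustration of conditional-branch semantics like `if_else_condition`.
# The config shape here is invented; the real node defines its own schema.
node = {
    "id": "gate",
    "type": "if_else_condition",
    "config": {"field": "score", "op": "gte", "value": 7,
               "if_true": "notify_sales", "if_false": "nurture"},
}

OPS = {
    "gte": lambda a, b: a >= b,
    "lt":  lambda a, b: a < b,
    "eq":  lambda a, b: a == b,
}

def next_node(node: dict, state: dict) -> str:
    """Pick the outgoing edge based on the node's condition and current state."""
    cfg = node["config"]
    branch = OPS[cfg["op"]](state[cfg["field"]], cfg["value"])
    return cfg["if_true"] if branch else cfg["if_false"]

print(next_node(node, {"score": 9}))  # notify_sales
print(next_node(node, {"score": 3}))  # nurture
```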
## Included Workflows
The `standalone_test_client/` directory ships with 27+ production workflow definitions — complete with graph schemas, LLM prompts, HITL test inputs, and runner scripts. These serve as both reference implementations and a starting point for building your own.
