BotBrain
Modular open-source brain for legged robots. Web UI for teleoperation, autonomous navigation, mapping & monitoring. 3D-printable hardware, built on ROS2.
# BotBrain Open Source (BBOSS) <img src="docs/images/bot_eyes.png" alt="🤖" width="50" style="vertical-align: middle;">
BotBrain is a modular collection of open-source software and hardware components that lets you drive, see, map, navigate (manually or autonomously), monitor, and manage legged (quadruped, biped, and humanoid) or wheeled ROS2 robots from a simple but powerful web UI. The hardware gives you 3D-printable mounts and an outer case so you can put BotBrain on your robot without guesswork.
- Designed around Intel RealSense D435i and the NVIDIA Jetson line
- Officially supported boards: Jetson Nano, Jetson Orin Nano (support for AGX and Thor coming soon)
- Everything is modular - you don't need to run every module (some heavy AI modules require an Orin AGX)
## Complete feature list
### Multi-Robot Platform Support
- Unitree Go2 & Go2-W - Quadruped robots with full hardware interface and control
- Unitree G1 - Humanoid with upper-body pose control and FSM transitions
- DirectDrive Tita - Biped with full control
- Custom robots - Extensible framework for adding any ROS2-compatible platform
- Legged & wheeled - Architecture supports both locomotion types
### Hardware & Sensors
- 3D printable enclosure - Snap-fit design with robot-specific mounting adapters (Go2, G1, and DirectDrive Tita)
- Intel RealSense D435i - Dual camera support for viewing and SLAM/Navigation
- IMU & odometry - Real-time pose estimation from all supported platforms
- Battery monitoring - Per-robot battery state with runtime estimation
### AI & Perception (Coming Soon)
- YOLOv8/v11 object detection - 80+ classes, TensorRT-optimized, real-time tracking on BotBrain
- ROSA natural language control - Conversational robot commands via LLM
- Detection history - Searchable log with images and descriptions
### Autonomous Navigation
- RTAB-Map SLAM - Visual mapping with single or dual RealSense D435i cameras
- Nav2 integration - Path planning, dynamic obstacle avoidance, recovery behaviors
- Mission planning - Create and execute multi-waypoint autonomous patrols
- Click-to-navigate - Set goals directly on the map interface
- Map management - Save, load, switch, and set home positions
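Conceptually, a multi-waypoint patrol is an ordered list of goal poses dispatched one at a time to the navigation stack. A minimal sketch, with `navigate` standing in for the real Nav2 `NavigateToPose` action call (names and structure are illustrative, not BotBrain's actual API):

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple

@dataclass
class Waypoint:
    x: float
    y: float
    yaw: float  # heading in radians

def run_patrol(waypoints: List[Waypoint],
               navigate: Callable[[Waypoint], bool],
               loops: int = 1) -> List[Tuple[int, Waypoint, bool]]:
    """Drive through the waypoint list `loops` times.

    `navigate` stands in for the Nav2 NavigateToPose action call and
    returns True on success. Results are collected (e.g. for the UI).
    """
    results = []
    for loop in range(loops):
        for wp in waypoints:
            ok = navigate(wp)
            results.append((loop, wp, ok))
            if not ok:
                return results  # abort patrol; recovery is handled elsewhere
    return results
```

In the real system the `navigate` callback would send the goal over the Nav2 action interface and block on (or await) the result; the sketch only shows the mission-ordering logic.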
### System Orchestration
- Lifecycle management - Coordinated node startup/shutdown with dependency ordering
- State machine - Defined system states with automatic startup and shutdown transitions
- Priority-based velocity control - 6-level command arbitration (joystick > nav > AI)
- Dead-man switch - Hardware/software safety lock for all motion commands
- Emergency stop - Comprehensive e-stop sequence
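The priority-based velocity arbitration above can be sketched as a small mux that keeps the freshest command per source and lets the highest-priority non-stale one through. The level names and timeout below are assumptions for illustration (the README only fixes joystick > nav > AI), not BotBrain's actual implementation:

```python
import time
from typing import Dict, Optional, Tuple

# Illustrative 6-level priority table; lower value wins.
PRIORITIES = {"estop": 0, "deadman": 1, "joystick": 2,
              "gamepad": 3, "nav": 4, "ai": 5}

class VelocityMux:
    """Keep the most recent command per source; emit the highest-priority
    command that has not gone stale."""

    def __init__(self, timeout: float = 0.5):
        self.timeout = timeout
        # source -> (timestamp, (linear, angular))
        self.commands: Dict[str, Tuple[float, Tuple[float, float]]] = {}

    def submit(self, source: str, linear: float, angular: float,
               now: Optional[float] = None) -> None:
        if source not in PRIORITIES:
            raise ValueError(f"unknown source: {source}")
        stamp = now if now is not None else time.monotonic()
        self.commands[source] = (stamp, (linear, angular))

    def output(self, now: Optional[float] = None) -> Optional[Tuple[float, float]]:
        now = now if now is not None else time.monotonic()
        fresh = [(PRIORITIES[s], cmd) for s, (t, cmd) in self.commands.items()
                 if now - t <= self.timeout]
        return min(fresh)[1] if fresh else None  # None = no fresh command, stop
```

With this scheme a joystick command transparently overrides an active navigation command, and a source that stops publishing simply ages out instead of needing an explicit release.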
### Control Interfaces
- CockPit - Pre-configured control page with cameras, 3D model, map, and quick actions
- My UI - Drag-and-drop customizable dashboard with resizable widgets
- Virtual joysticks - Touch/mouse dual-stick control with velocity tuning
- Gamepad support - PS5, Xbox or generic joystick with custom button mapping and mode switching
- Keyboard control - WASD controls
- Speed profiles - Multiple velocity presets for different operational modes (Beginner, Normal, and Insane)
- Robot actions - Stand/sit, lock/unlock, gait selection, lights, mode transitions
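A speed profile is essentially a pair of velocity limits that scales the normalized joystick axes before they become a velocity command. A short sketch with made-up limit values (BotBrain's shipped presets may differ):

```python
from typing import Tuple

# Placeholder limits in m/s and rad/s; not BotBrain's actual numbers.
SPEED_PROFILES = {
    "beginner": {"max_linear": 0.3, "max_angular": 0.5},
    "normal":   {"max_linear": 0.8, "max_angular": 1.2},
    "insane":   {"max_linear": 1.6, "max_angular": 2.5},
}

def stick_to_twist(stick_x: float, stick_y: float,
                   profile: str) -> Tuple[float, float]:
    """Map normalized dual-stick input (-1..1 per axis) to
    (linear, angular) velocities under the selected profile's limits."""
    limits = SPEED_PROFILES[profile]

    def clamp(v: float) -> float:
        return max(-1.0, min(1.0, v))

    return (clamp(stick_y) * limits["max_linear"],
            clamp(stick_x) * limits["max_angular"])
```

Switching profiles then changes only the scaling table, so the same virtual-joystick, gamepad, and keyboard inputs all pass through one place before reaching the velocity mux.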
### Camera & Video
- Multi-camera streaming - Dynamic discovery for front, rear, and custom topics
- H.264/H.265 codecs - Resolution scaling, frame rate control, bandwidth optimization
- In-browser recording - Record camera video and save it to your downloads folder
- 3D visualization - URDF-based robot model with laser scan overlay and navigation path
### System Monitoring
- Jetson stats - Board model, JetPack version, power mode, uptime
- CPU/GPU monitoring - Per-core usage, frequency, memory, thermal throttling
- Power tracking - Per-rail voltage, current, and wattage with peak detection
- Thermals & fans - CPU/GPU/SOC temps with fan speed control
- Storage & memory - Disk usage alerts, RAM/swap monitoring
### Networking & Fleet
- WiFi control panel - Network scanning, switching, and signal monitoring
- Connection modes - WiFi, Ethernet, 4G, hotspot with latency tracking
- Multi-robot fleet - Simultaneous connections, fleet-wide commands, status dashboard
- Diagnostics - Node health, error/warning logs, state machine visualization
### Customization & UX
- Light/dark themes - Custom accent colors, persistent preferences
- Responsive layouts - Mobile, tablet, and desktop with touch support
- User profiles - Avatar, display name, theme color via Supabase Auth
- Multi-language - English and Portuguese with regional formats
- Audit logging - Searchable event history across 10+ categories with CSV export
- Activity analytics - Usage heatmaps and robot utilization tracking
## Table of Contents
- Overview
- Project Structure
- Requirements
- Installation
- Frontend Development
- Features
- Configuration
- Custom Robots
- Troubleshooting
- Contributing
- License
## Overview
BotBrain consists of three main components:
### Hardware
A 3D printable enclosure with internal mounts designed to house an NVIDIA Jetson board and two Intel RealSense D435i cameras. The modular design allows you to attach BotBrain to various robot platforms without custom fabrication.
### Frontend
A Next.js 15 web dashboard built with React 19 and TypeScript. It provides real-time robot control, camera streaming, map visualization, mission planning, system monitoring, and fleet management.
