RoboCrew
🦾 Make your robot autonomous with an LLM agent. Set it up as easily as a regular agent in CrewAI or AutoGen.

Create an LLM agent for your robot. Connect movement tools, VLA policies, and sensor scans in just a few lines of code.
<p align="center"> <img src="https://raw.githubusercontent.com/Grigorij-Dudnik/RoboCrew-assets/master/Demo_videos/robocrew_v3_9fps.gif" alt="RoboCrew demo" width="700"> </p> <p align="center"><em>RoboCrew agent cleaning up a table.</em></p>

🚀 Quick Start
Run on your robot:

```bash
pip install robocrew
```

Start the GUI app with:

```bash
robocrew-gui
```
✨ Features

- 🚗 Movement - Pre-built wheel controls for mobile robots
- 🦾 Manipulation - VLA models as tools for arm control
- 👁️ Vision - Camera feed with image augmentation for better spatial understanding
- 🎤 Voice - Wake-word-activated voice commands and TTS responses
- 🗺️ LiDAR - Top-down mapping with a LiDAR sensor
- 🧠 Intelligence - Multi-agent control provides autonomy in decision making

🤖 Supported Robots

- ✅ XLeRobot - Full support for all features
- 🔥 LeKiwi - Use the XLeRobot code (compatible platform)
- 🌍 Earth Rover mini plus - Full support
- 🔜 More robot platforms coming soon! Request your platform ➡️

🎯 How It Works

<div align="center"> <img src="https://raw.githubusercontent.com/Grigorij-Dudnik/RoboCrew-assets/master/Images/robot_agent.png" alt="How It Works Diagram" width="400"> </div>

The RoboCrew Intelligence Loop:
- 📥 Input - Voice commands, text tasks, or autonomous operation
- 🧠 LLM Processing - The LLM analyzes the task and environment...
- 🛠️ Tool Selection - ...and chooses appropriate tools (move, turn, grab an apple, etc.)
- 🤖 Robot Actions - Wheels and arms execute the commands
- 📹 Visual Feedback - Cameras capture the results with an augmented overlay
- 🔁 Repeat - The LLM evaluates the results and adjusts its strategy
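The loop above can be pictured as a few lines of plain Python. This is a self-contained sketch, not RoboCrew's API: the camera, the LLM, and the tools are all stubs (`capture_frame`, `llm_choose_tool`, and the canned `FRAMES` feed are invented for illustration).

```python
# Illustrative stand-ins for the camera feed and the LLM's tool choice.
FRAMES = ["human 3m ahead", "human 1m ahead", "human reached"]

def capture_frame(step):
    """Stub camera: return the canned observation for this step."""
    return FRAMES[step]

def llm_choose_tool(task, observation):
    """Stub LLM: pick a tool name from the current observation."""
    return "done" if "reached" in observation else "move_forward"

def run_agent(task, max_steps=10):
    """Sense -> decide -> act -> repeat, until the LLM says 'done'."""
    actions = []
    for step in range(min(max_steps, len(FRAMES))):
        observation = capture_frame(step)           # input
        tool = llm_choose_tool(task, observation)   # LLM processing + tool selection
        if tool == "done":                          # LLM decides the task is complete
            break
        actions.append(tool)                        # robot action (stubbed)
    return actions

print(run_agent("Approach a human."))  # ['move_forward', 'move_forward']
```

In the real library the stubs are replaced by the camera feed, the LLM call, and the hardware tools, but the control flow is the same repeat-until-done loop.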

📱 Scripts to Use

To gain full control over RoboCrew features, you can create your own script. The simplest example:

```python
from robocrew.core.camera import RobotCamera
from robocrew.core.LLMAgent import LLMAgent
from robocrew.robots.XLeRobot.tools import create_move_forward, create_turn_right, create_turn_left
from robocrew.robots.XLeRobot.servo_controls import ServoControler

# 📷 Set up the main camera
main_camera = RobotCamera("/dev/camera_center")  # your camera USB port, e.g. /dev/video0

# 🎛️ Set up the servo controller
right_arm_wheel_usb = "/dev/arm_right"  # your right-arm USB port, e.g. /dev/ttyACM1
servo_controler = ServoControler(right_arm_wheel_usb=right_arm_wheel_usb)

# 🛠️ Set up tools
move_forward = create_move_forward(servo_controler)
turn_left = create_turn_left(servo_controler)
turn_right = create_turn_right(servo_controler)

# 🤖 Initialize the agent
agent = LLMAgent(
    model="google_genai:gemini-3-flash-preview",
    tools=[move_forward, turn_left, turn_right],
    main_camera=main_camera,
    servo_controler=servo_controler,
)

# 🎯 Give it a task and go!
agent.task = "Approach a human."
agent.go()
```
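The `create_*` helpers follow a closure-factory pattern: each takes the shared servo controller and returns a ready-to-use tool function. A hedged sketch of writing your own tool in the same style; the `create_beep` factory and its behavior are invented here for illustration and are not part of RoboCrew:

```python
# Hypothetical custom tool in the same factory style as create_move_forward.
def create_beep(servo_controler):
    def beep():
        """Make the robot beep once to get a human's attention."""
        # Replace this with a real hardware call on your controller.
        return "beep!"
    return beep

beep = create_beep(servo_controler=None)  # None stands in for a real controller
print(beep())  # beep!
```

A tool built this way can be appended to the agent's `tools` list alongside the pre-built ones.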

🎤 Enable Listening and Speaking

Use your voice to tell the robot what to do.

📖 Docs: https://grigorij-dudnik.github.io/RoboCrew-docs/guides/examples/audio/
💻 Code example: examples/2_xlerobot_listening_and_speaking.py

🦾 Add a VLA Policy as a Tool

Let's make our robot manipulate objects with its arms!

📖 Docs: https://grigorij-dudnik.github.io/RoboCrew-docs/guides/examples/vla-as-tools/
💻 Code example: examples/3_xlerobot_arm_manipulation.py

🧠 Increase Intelligence with Multi-Agent Communication

One agent plans the mission while another controls the robot.

📖 Docs: https://grigorij-dudnik.github.io/RoboCrew-docs/guides/examples/multiagent/
💻 Code example: examples/4_xlerobot_multiagent_cooperation.py
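The planner/executor split can be pictured with a small self-contained sketch. The class names, the `plan`/`execute` methods, and the two-step plan below are illustrative stand-ins, not RoboCrew's multi-agent API:

```python
class Planner:
    """Breaks a mission into ordered subtasks (stub for an LLM planning call)."""
    def plan(self, mission):
        return [f"{mission}: step {i}" for i in (1, 2)]

class Executor:
    """Carries out one subtask at a time (stub for the robot-control agent)."""
    def __init__(self):
        self.log = []
    def execute(self, subtask):
        self.log.append(subtask)   # pretend to drive the robot
        return "ok"

def run_mission(mission):
    planner, executor = Planner(), Executor()
    for subtask in planner.plan(mission):   # planner hands subtasks over
        status = executor.execute(subtask)  # executor reports status back
        assert status == "ok"
    return executor.log

print(run_mission("Clean the table"))
```

Separating planning from control keeps the robot-facing agent's context small while the planner tracks the overall mission.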

💬 Community & Support

- 🎮 Join our Discord - Get help, share projects, discuss features
- 📚 Read the Docs - Comprehensive guides and API reference
- 🐛 Report Issues - Found a bug? Let us know!
- ⭐ Star on GitHub - Show your support!

❤️ Special thanks to all contributors and early adopters!
