Tonic
An autonomous vehicle written in python
An open-source autonomous car project
Introduction • Documentation • Screenshots • Contribute


⚠️ ➡️ Video here ⬅️ ⚠️
a.k.a.: "Roomba, that does not suck"
Written in Python 🐍 (mostly).
Contents:
Introduction
This repository contains the main software and documentation for the Tonic project. The project aims to create an open-source autonomous driving system, along with a hardware prototype implementation. Some essential parts of the project live in other related repositories (see the list of related repos below). The core idea of how this should work is as follows:
- After setting up the robot/car, drive it manually, and dump the video and steering feed (this part is called data taking).
- Create a 3D mapping of the environment with Tonic/autonomous.
- Define checkpoints, through which the machine will drive.
- Program the car to drive on the defined paths.
All of this should be possible as cheaply as possible, with a Raspberry Pi and only a single camera.
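The data-taking step above boils down to logging timestamped steering commands (and sensor data) so that a run can be replayed later. A minimal sketch of such a recorder, using hypothetical names and a JSON-lines log format (not Tonic's actual code):

```python
import json
import os
import tempfile
import time

class DriveRecorder:
    """Appends timestamped steering commands to a JSON-lines log file."""

    def __init__(self, path):
        self.path = path

    def record(self, steering, throttle, timestamp=None):
        entry = {
            "t": time.time() if timestamp is None else timestamp,
            "steering": steering,
            "throttle": throttle,
        }
        with open(self.path, "a") as f:
            f.write(json.dumps(entry) + "\n")

    def replay(self):
        """Read back all recorded commands, in order."""
        with open(self.path) as f:
            return [json.loads(line) for line in f]

# Dump two manual-driving commands, then read them back for replay
log_path = os.path.join(tempfile.mkdtemp(), "drive_log.jsonl")
recorder = DriveRecorder(log_path)
recorder.record(steering=0.1, throttle=0.5, timestamp=0.0)
recorder.record(steering=-0.2, throttle=0.4, timestamp=0.1)
log = recorder.replay()
```

An append-only line-per-record format like this is convenient for data taking because a crashed run still leaves a valid, replayable log.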
Features
<img src="https://imgur.com/oA3ERWN.gif" width="45%" height="45%" align="right" />

- Camera live feed and recording.
- Live steering system and recording.
- Working IMU live streaming and recording.
- Working odometry live streaming and recording.
- Qt GUI client for driving and data taking.
- SLAM mapping and navigation, implemented with ORB_SLAM2 and its custom fork, custom Python bindings, and serialisation.
How does it work
As of now, this repository (mmajewsk/Tonic) contains guides and software for building, running, and steering the car 🚘 for data taking. The code is divided into Tonic/control and Tonic/car.
Tonic/control contains the code that is meant to run on your laptop/PC/Mac and that controls the Raspberry Pi running Tonic/car.
The machine and the control interface communicate over a WiFi network using sockets.
The sensors, camera, and steering are each implemented as a separate service using sockets. You can steer the car with the keyboard on your PC while watching a live feed from the camera. All of the sensor, steering, and video data can be dumped to files on the PC. You don't need to turn on all of the sensors to make this work.
The odometry and IMU are not necessary for creating an environment mapping.
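Since each service just pushes payloads over a TCP socket, the pattern can be sketched with a simple length-prefixed framing scheme. This is an illustrative loopback demo only; the frame format, payloads, and function names are assumptions, not Tonic's actual wire protocol:

```python
import socket
import struct
import threading

def send_frame(conn, payload: bytes):
    # Prefix each payload with its 4-byte big-endian length
    conn.sendall(struct.pack(">I", len(payload)) + payload)

def recv_exact(conn, n: int) -> bytes:
    # TCP is a byte stream, so keep reading until n bytes arrive
    buf = b""
    while len(buf) < n:
        chunk = conn.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("socket closed mid-frame")
        buf += chunk
    return buf

def recv_frame(conn) -> bytes:
    (length,) = struct.unpack(">I", recv_exact(conn, 4))
    return recv_exact(conn, length)

# Loopback demo: a "car" thread streams two frames to a "PC" client
server = socket.socket()
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]

def car_side():
    conn, _ = server.accept()
    send_frame(conn, b"camera: jpeg bytes here")
    send_frame(conn, b'steering: {"angle": 0.1}')
    conn.close()

t = threading.Thread(target=car_side)
t.start()
pc = socket.create_connection(("127.0.0.1", port))
frames = [recv_frame(pc), recv_frame(pc)]
pc.close()
t.join()
```

Running each sensor as its own socket service like this is what makes the setup modular: the PC can attach to only the camera and steering services and skip the rest.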
Ok, so how do I start?
- Take a look at previous versions and the current one in video and screenshots.
- First, assemble the hardware.
- Then set up the machine and interface software.
- Do the data-taking run, recording steering and video data, as described here.
To make your machine drive autonomously, follow the guide in the Tonic/autonomous repo.
Contribute
🧑🔧 This project is meant to be open for everyone, and contributions are welcome. If you would like to help, see what's listed in the issues here, or add something yourself.
Also, you can join the 🗣️ Discord server if you are looking for quick help, or just want to say hi ;)
Related repos
- My fork of ORB_SLAM2
- My fork of Osmap - dumps ORB_SLAM2 maps to a file
- My fork of PythonBindings - this one combines Osmap with ORB_SLAM2 Python bindings!
- TonicSlamDunk - install scripts for all of the above; includes scripts for Ubuntu and a Dockerfile.
- ~~TonicOrange - Exemplary use of orb slam, for pathfinding~~ (moved to Tonic/autonomous)