
LIMOncello

Iterated Error-State Kalman Filter on the SGal(3) Manifold for Fast LiDAR-Inertial Odometry.



Another tightly coupled LiDAR‑Inertial SLAM algorithm? Yep, based on the existing algorithms FAST-LIO2, LIMO-Velo, Fast LIMO and DLIO, but cleaner, faster, and without IKFoM/ikd-Tree dependencies.

LIMOncello is essentially FAST-LIO2 with the Fast LIMO (and thus DLIO) implementation, but without any dependency on the IKFoM or ikd-Tree C++ libraries. It implements an Iterated Error-State Extended Kalman Filter (IESKF) as described in IKFoM and FAST-LIO2, reimplemented using the manif library and a refactored version of the original iOctree data structure.

This is the result of my efforts to truly understand LIMO-Velo and, consequently, FAST-LIO2. Those insights led to improvements in both performance and accuracy: LIMOncello is the first known LIO-SLAM system to incorporate the SGal(3) manifold in its state representation, and it uses iOctree, which is much faster and more efficient than ikd-Tree. On top of that, the code is portable (though not as portable as Fast LIMO) and kept as simple as possible (the entire IKFoM library is condensed into State.hpp thanks to manif, for example), so it’s accessible to anyone who wants to modify it.

<div align="center"> <img src="misc/CAT16x.gif" alt="CAT16X's performance" width="800"/> <small> <p> Formula Student race car CAT16X. Velocity in straights (~15m/s) and really tight turns (~120deg/s) (<a href="https://youtu.be/mk9U0lRWr-0?si=j4mM6e5dzihfCLJM">CAT15X</a>'s video for reference) </p> </small> </div> <br/>

Dependencies

LIMOncello is header-only, and its core depends only on:

All Livox driver dependencies can be bypassed by compiling RESPLE or by solely using its custom message definitions.

Approach

To truly understand the concepts, here are the papers that greatly influenced this work, apart from Fast LIMO:

I should also mention S-FAST_LIO, which helped me grasp the initial intuition behind the math of IKFoM, and manif, which provides accessible, elegant Lie algebra tools for nearly any manifold used in robotics.

How To

Run the node by specifying the configuration name (without the .yaml extension):

ros2 launch limoncello limoncello.launch.py config_name:=cat16x

The launch file loads the configuration from <limoncello_package>/config/<config_name>.yaml. Optional launch arguments:

ros2 launch limoncello limoncello.launch.py config_name:=cat16x rviz:=true use_sim_time:=false

Most parameters are documented directly in each configuration file. However, special attention should be paid to those related to sensors.calibration, sensors.time_offset, sensors.extrinsics, and sensors.intrinsics.

First, note that the IMU reference frame is used as the robot reference frame in the Kalman filter. The IMU frame and LiDAR frame are both defined with respect to the base_link, which is an arbitrarily chosen reference location on the robot. For example, in wheeled vehicles this is often the projection of the center of gravity (CoG) onto the x–y plane, such that the LiDAR map places the floor on the x–y plane.

Under this convention, sensors.extrinsics.imu2baselink defines the initial pose (position and orientation) of the system at startup, meaning base_link coincides with the world origin at that moment. For instance, most IMUs follow the NEU convention but are often mounted rotated 180 degrees about the x-axis, so the z-axis points downward. In that case, the rotation must be accounted for in sensors.extrinsics.imu2baselink.
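As a concrete sketch, a z-down IMU mounted 30 cm above base_link could be described as below. The key names t and R follow the parameter table later in this README; the surrounding YAML nesting and the numeric values are assumptions, so check them against the shipped config files:

```yaml
sensors:
  extrinsics:
    imu2baselink:
      t: [0.0, 0.0, 0.3]     # IMU position relative to base_link [m] (example values)
      R: [180.0, 0.0, 0.0]   # roll/pitch/yaw [deg]: 180 deg about x flips z downward
```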

sensors.intrinsics may be specified manually and are typically obtained from a static IMU calibration procedure, during which the Allan noise parameters are also estimated; those parameters are used to configure IKFoM.covariance. If the IMU has not been calibrated, it is recommended to assign reasonably large values (but not exceeding 1) to IKFoM.covariance and smaller values to sensors.intrinsics.
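For an uncalibrated IMU, that advice could look like the following sketch. Only the top-level names IKFoM.covariance and sensors.intrinsics come from this README; the individual keys and values are hypothetical placeholders, not LIMOncello's actual schema:

```yaml
IKFoM:
  covariance:
    gyro: 1.0e-2                  # keep process noise reasonably large (<= 1) ...
    accel: 1.0e-1
sensors:
  intrinsics:
    gyro_bias:  [0.0, 0.0, 0.0]   # ... and the stated sensor intrinsics small
    accel_bias: [0.0, 0.0, 0.0]
```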

If poor initialization is observed, it is most likely due to incorrect initial bias estimation. In such cases, you may reduce the initial covariance value IKFoM.covariance.initial_cov to help stabilize the filter startup (this should be done cautiously), or calibrate the bias online using sensors.calibration, assuming the robot is stationary. When enabled, sensors.calibration.gravity_align attempts to refine the initial orientation defined in sensors.extrinsics.imu2baselink. Gravity alignment exploits the fact that gravity always points downward by detecting discrepancies between the expected gravity direction and the accelerometer measurements during static conditions.
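A sketch of enabling that online calibration at startup; the key names match the calibration/* rows of the parameter table below, while the nesting under sensors and the values are examples only:

```yaml
sensors:
  calibration:
    gravity_align: true   # refine initial orientation from gravity while static
    accel: true           # estimate accelerometer bias while static
    gyro: true            # estimate gyroscope bias while static
    time: 3.0             # seconds the robot must remain stationary (example)
```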

Although the LiDAR and IMU may be physically misaligned, sensors.calibration.gravity_align applies the same rotation and translation correction to both sensors, assuming they are rigidly mounted to the same body. The relative extrinsics between the IMU and the LiDAR therefore remain fixed and are not re-estimated. As a result, the IMU (and thus the robot body frame) may not appear level with respect to the ground.

It is strongly discouraged to enable IKFoM.estimate_extrinsics. In most practical scenarios, online extrinsic estimation does not converge reliably and often leads to unstable or incorrect behavior. Providing accurate extrinsic parameters offline is the recommended approach.

sensors.time_offset does not need to be set even if the LiDAR and IMU are not hardware-synchronized: as long as both sensors timestamp their messages against the same UTC time base, LIMOncello handles the alignment internally. This parameter is intended for cases where the sensors are not properly synchronized even at the software level.

Finally, note that the estimated velocity is published with respect to base_link, not the IMU frame. This frame convention can be modified in include/ROSUtils.hpp if needed.

Configuration

This section explains LIMOncello's configuration file.

| Parameter | Units | Summary |
|----------------------------------|-------|----------------------------------------------------------------------------------|
| topics/input/lidar | – | ROS topic for incoming LiDAR point cloud |
| topics/input/imu | – | ROS topic for incoming IMU data |
| topics/input/stop_ioctree_update | – | ROS topic to trigger stopping of the iOctree updates (std_msgs/Bool) |
| topics/output/state | – | ROS topic for published state estimates |
| topics/output/frame | – | ROS topic for published full point cloud frame |
| frame_id | – | Coordinate frame ID used for all published data |
| verbose | – | Display execution time board |
| debug | – | Publish intermediate point clouds (deskewed, processed, …) for visualization |
| sensors/lidar/type | – | LiDAR model type (see config files) |
| sensors/lidar/end_of_sweep | – | Whether sweep timestamp refers to end (true) or start (false) of scan |
| sensors/imu/hz | Hz | IMU data rate |
| calibration/gravity_align | – | If true, estimate gravity vector while robot is stationary |
| calibration/accel | – | If true, estimate linear accelerometer bias while robot is stationary |
| calibration/gyro | – | If true, estimate gyroscope bias while robot is stationary |
| calibration/time | s | Duration for which robot must remain stationary to perform above calibrations |
| time_offset | – | Whether to account for sync offset between IMU and LiDAR |
| extrinsics/imu2baselink/t | m | Translation of IMU relative to base_link |
| extrinsics/imu2baselink/R | deg | Rotation (roll, pitch, yaw) of IMU relative to base_link |
| extrinsics/lidar2baselink/t | m | Translation of LiDAR relative to base_link |
