
SLAM2REF

This project aligns and corrects LiDAR-based SLAM session data against a reference map or another session. Given an accurate TLS point cloud as a reference map (accurate at least in the position of permanent elements such as walls and columns), it can also retrieve 6-DoF poses with an accuracy of up to 3 cm.


<h1 align="center" style="border-bottom: none"> <b> ⭐️ <img src=doc/imgs/SLAM2REF_LOGO7.png alt="SLAM2REF Logo" width="40%" /> ⭐️ <br> Long-Term Mapping with 3D LiDAR and Reference Map Integration </b> </h1> <p align="center"> Align and correct your LiDAR-based SLAM data with a reference map or a previous session. </p> <p align="center"> <a href="https://arxiv.org/abs/2408.15948"><b>ArXiv Paper (better images)</b></a> • <a href="https://link.springer.com/article/10.1007/s41693-024-00126-w"><b>Paper (nicer to read)</b></a> • <a href="https://mediatum.ub.tum.de/1743877"><b>Data</b></a> </p> <p align="center"> <a href="https://github.com/MigVega/SLAM2REF/actions/workflows/cmake-single-platform.yml"> <img src="https://github.com/MigVega/SLAM2REF/actions/workflows/cmake-single-platform.yml/badge.svg" alt="Build"> </a> <a href="https://arxiv.org/abs/2408.15948"> <img src="https://img.shields.io/badge/arXiv-2408.15948-%23B31C1B?style=flat" alt="arxiv"> </a> <a href="https://youtu.be/5WgPRRijI4Y"> <img src="https://img.shields.io/youtube/views/d_-ZYJhgGIk?label=YouTube&style=flat" alt="YouTube"> </a> <img src="https://img.shields.io/badge/C++-Solutions-blue.svg?style=flat&logo=c%2B%2B" alt="C++"> <img src="https://img.shields.io/github/license/MigVega/SLAM2REF" alt="License"> <a href="https://github.com/MigVega/SLAM2REF"> <img src="https://img.shields.io/github/stars/MigVega/SLAM2REF.svg?style=flat&logo=github&colorB=deeppink&label=stars" alt="GitHub stars"> </a> <a href="https://github.com/MigVega/SLAM2REF"> <img src="https://img.shields.io/github/forks/MigVega/SLAM2REF" alt="GitHub forks"> </a> <a href="https://github.com/MigVega/SLAM2REF"> <img src="https://img.shields.io/github/issues/MigVega/SLAM2REF" alt="GitHub issues"> </a> </p> <!-- TO ADD --> <!-- ![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/dataset/slam2ref)(https://paperswithcode.com/sota)-->

What is SLAM2REF?

SLAM2REF uses pose-graph multi-session anchoring to align your LiDAR data with a reference map or with another session, allowing precise 6-DoF pose retrieval and map extension.

  • This project is an extension of LT-SLAM, which implements a custom GTSAM factor for anchoring (see BetweenFactorWithAnchoring.h). However, this project is completely ROS-independent. This is also an extension of the BIM-SLAM project, for which a video explanation is available.
  • Moreover, we have implemented a novel Indoor Scan Context Descriptor for fast place recognition, which is an extension of Scan Context. You can use this for fast global localization in indoor environments.
  • We also introduce YawGICP, a novel algorithm for robust point cloud registration under predominantly yaw-angle variations, implemented on top of Open3D's GICP. You can use it to register your sequential scans.
  • SLAM2REF additionally allows the retrieval of 6-DoF poses with an accuracy of up to 3 cm given an accurate TLS point cloud as a reference map (this map should be accurate, at least regarding the position of permanent elements such as walls and columns). These poses are precise enough to serve as ground truth for evaluating state-of-the-art SLAM, localization, or pose estimation algorithms. Additionally, they can be used to retrieve an accurately updated and aligned map with the reference map, for example, to update a digital twin of a facility.
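
As a rough sketch of the multi-session anchoring idea (the notation here is assumed for illustration, not taken from this repository's code): each session carries an anchor node a_s in SE(3) that places that session in the global frame, and an inter-session loop measurement z_ij between pose x_i of one session and pose x_j of another contributes a residual of the form

```latex
% Hedged sketch of an anchored between-factor residual (notation assumed):
% z_{ij} is the measured relative pose; a_1, a_2 are the session anchors.
r_{ij} = \operatorname{Log}\!\left( z_{ij}^{-1}\,(a_1 x_i)^{-1}\,(a_2 x_j) \right)
```

Optimizing over the anchors together with the poses lets each session keep its own internal frame while inter-session loop closures pull the sessions into a common alignment.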

The following image presents a very brief overview of how the method works.

<p align="center"><img src="doc/imgs/Gtihub_overview__.png" alt="SLAM2REF Github - Overview" width="85%" /></p>

How to Run the Code

Compatibility Note

This project has been tested only on Ubuntu 20.04. While it may potentially run on Ubuntu 22.04, it would require the latest version of GTSAM and adjustments in the BetweenFactorWithAnchoring class to be compatible.

0. Installing the Dependencies

  • We recommend installing GTSAM and Open3D from source. You will also need OpenCV and PCL.

The commands to install all dependencies can be found in the file inside .github/workflows.

1. Cloning and Building the Project

  • Run the following in a terminal to clone the repository.
    cd
    mkdir -p Repos/00.SLAM2REF/code
    cd Repos/00.SLAM2REF/code
    git clone https://github.com/MigVega/SLAM2REF .
  • When building the project, use the -j 5 flag to limit the build to five threads; this helps prevent the build from exiting prematurely (for example, from running out of memory). You can do this from an IDE or with the following commands in a terminal.
    cd
    cd Repos/00.SLAM2REF/code
    mkdir build
    cd build
    cmake ..
    make -j 5

2. Setting Up Directory Structure

  • After a successful build, you need to set up the directory structure. Run the following commands in your terminal to create the necessary folders:

      cd
      mkdir -p Repos/00.SLAM2REF/data/outputs/00.TestA01_scans_in_BIM_ROI
      cd Repos/00.SLAM2REF/data
    

3. Download and Prepare Sample Data

  • You can download sample data from this link and unzip the file inside the Repos/00.SLAM2REF/data directory. Ensure that the "input" folder is more than 500 MB in size to confirm a successful download.
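
One quick way to confirm the size from a terminal is `du -sm`, which prints a folder's size in MB (a sketch; for the real check, point it at the folder you unzipped into Repos/00.SLAM2REF/data — here it is demonstrated on a small scratch folder):

```shell
# Demo on a scratch folder; for the real data, run
#   du -sm ~/Repos/00.SLAM2REF/data/input
# and expect a value above 500 (MB) for a complete download.
mkdir -p /tmp/slam2ref_size_demo/input
dd if=/dev/zero of=/tmp/slam2ref_size_demo/input/blob bs=1M count=2 status=none
size_mb=$(du -sm /tmp/slam2ref_size_demo/input | cut -f1)
echo "${size_mb}"
```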

4. Configure the Project

  • Open the config/params.yaml file and replace the three occurrences of mlegion with your Linux username.
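
This can be done by hand in any editor, or with a sed one-liner such as the following (a sketch, shown on a scratch copy so as not to touch your real config; for the real run, replace the demo path with config/params.yaml):

```shell
# Demo file standing in for config/params.yaml (three 'mlegion' occurrences).
printf 'user: mlegion\nin:  /home/mlegion/data\nout: /home/mlegion/out\n' > /tmp/params_demo.yaml
# Replace every occurrence of the placeholder with your Linux username.
sed -i "s/mlegion/$(whoami)/g" /tmp/params_demo.yaml
# Verify: grep -c prints 0 when all occurrences were replaced.
grep -c "mlegion" /tmp/params_demo.yaml || true
```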

5. Running the Code

  • Start the execution by running the code. You should see the following in the console:
        ----> Slam2ref starts.
    
  • Optional: The final ICP step can significantly extend execution time (approximately 22 minutes). By default, it is deactivated to speed up the process. However, if you wish to refine the poses to centimeter accuracy, you can enable this step in a subsequent run.
    • To activate the final ICP step, open the config/params.yaml file and set the using_MV_performing_final_ICP parameter to true:

      using_MV_performing_final_ICP: true
      

6. Monitoring Execution and Output

  • The program's execution time varies depending on whether the final ICP step is included:

    • Without final ICP: less than 1 minute.
    • With final ICP: Approximately 22 minutes.
  • Once the program has finished, you will see the following message in the console:

    ----> Slam2ref done.
    
  • Check the output files in CloudCompare (all should be in /home/[your-username]/Repos/00.SLAM2REF/data/outputs/00.TestA01_scans_in_BIM_ROI/TestA01). Among the .txt files, only the ones with CC can be opened directly in CloudCompare.

  • For instance, the file !DLIO_real_world_SLAM_session_central_aft_KNN_intersession_loops_CC.txt contains the poses in point cloud format of the real-world session after the ISC and KNN loops, which are aligned with the BIM model. In contrast, the original poses before these adjustments are in !DLIO_real_world_SLAM_session_central_bfr_intersession_loops_CC.txt. Additionally, you can view the original session poses from the BIM model in !BIM_session_scans_296-422_central_bfr_intersession_loops_CC.txt.

    • Tip: Increase the point size in your viewer to better visualize the poses.
  • If you performed the final ICP step, you should also have a folder named "Part7_final_ICP". Inside, the file !00_CENTRAL_SESSION_after_FinalICP_CC.txt contains the further refined poses. In this file:

    • Blue denotes poses considered good,
    • Green denotes perfect poses,
    • Red denotes poses that are poor,
    • Black denotes scans outside the reference map where registration was not possible.
  • The file !FINAL_source_cloud_TOTAL.pcd provides the map of the real-world session reconstructed using only the good and perfectly registered scans. You may need to colorize this map for better visualization. Compare this reconstructed map with the reference map located at /home/[your-username]/Repos/00.SLAM2REF/data/inputs/01.Ref_maps_pcs.
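
If you want a quick tally of how the scans were classified without opening CloudCompare, something along these lines may work (a sketch under assumptions: that the *_CC.txt files are plain-text point lists, and that the column layout is x y z r g b with 0-255 colors — neither is confirmed from the repository):

```shell
# Demo on a fabricated two-point file (one blue point, one red point);
# point awk at the real !00_CENTRAL_SESSION_after_FinalICP_CC.txt instead.
# Assumed columns: x y z r g b (hypothetical, check against your data).
printf '0 0 0 0 0 255\n1 0 0 255 0 0\n' > /tmp/cc_demo.txt
awk '{ if ($6==255) blue++; else if ($4==255) red++ }
     END { printf "blue(good)=%d red(poor)=%d\n", blue, red }' /tmp/cc_demo.txt
```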

Stay Up-to-Date / Support the Project

  • Upcoming Features: We will be adding more code soon, including functionality to generate session data from the reference map.
  • Stay Informed: To keep up with the latest updates and developments, make sure to watch and star the repository!

Your support helps us continue improving the project. Thank you for being part of our community!

<p align="center"><img src="doc/imgs/github_start_only.gif" alt="SLAM2REF Github - how to star the repo" width="65%" /></p>

License

  • Academic Use: The code is available under the [GPLv3 License](https://www.gn