Robust BIM-based 2D-LiDAR Localization for Lifelong Indoor Navigation in Changing and Dynamic Environments

<h1 align="center" style="border-bottom: none"> <b> ⭐️ OGM2PGBM ⭐️ <br> Occupancy Grid Map to Pose Graph-based Map for long-term 2D LiDAR-based localization </b> </h1> <p align="center"> Achieve a 78% improvement in real-time localization accuracy by transitioning from classical particle filter methods to robust pose graph-based algorithms using OGM2PGBM. This approach enables you to leverage any existing 2D reference map for enhanced precision and reliability. </p> <p align="center"> <a href="https://arxiv.org/abs/2308.05443"><b>ArXiv Paper</b></a> • <a href="https://mediatum.ub.tum.de/1749236"><b>Data</b></a> </p> <p align="center"> <a href="https://arxiv.org/abs/2408.15948"> <img src="https://img.shields.io/badge/arXiv-2308.05443-%23B31C1B?style=flat" alt="arxiv"> </a> <a href="https://doi.org/10.5281/zenodo.513174972"> <img src="https://zenodo.org/badge/513174972.svg" alt="DOI"> </a> <img src="https://img.shields.io/github/license/MigVega/Ogm2Pgbm" alt="License"> <a href="https://github.com/MigVega/SLAM2REF"> <img src="https://img.shields.io/github/stars/MigVega/Ogm2Pgbm.svg?style=flat&logo=github&colorB=deeppink&label=stars" alt="GitHub stars"> </a> <a href="https://github.com/MigVega/SLAM2REF"> <img src="https://img.shields.io/github/forks/MigVega/Ogm2Pgbm" alt="GitHub forks"> </a> <a href="https://github.com/MigVega/SLAM2REF"> <img src="https://img.shields.io/github/issues/MigVega/Ogm2Pgbm" alt="GitHub issues"> </a> </p>

This repo contains the following two applications:

  1. OGM2PGBM: generates pose graph-based maps from 2D occupancy grid maps, which can be created from a TLS point cloud or a BIM/CAD model. These pose graph-based maps can be used for accurate localization in changing and dynamic environments, as demonstrated in our [paper][paper].

    The following animation shows an overview of the method and compares AMCL and Cartographer; the latter can be applied after leveraging the Ogm2Pgbm package. It is clear that AMCL has more difficulty keeping track of the robot's pose than Cartographer does.

    video

  2. GMCL & CARTO/SLAM_toolbox: combines the fast global localization of GMCL with the more accurate pose tracking of Cartographer/SLAM_toolbox

Additionally, it includes the packages amcl, gmcl, cartographer and slam_toolbox, so that they can easily be used and compared with a bagfile located in the mounted directory ~/workspace.

Requirements

If you plan to use our docker container with all the methods installed (warning: it requires >5 GB of space), you only need to install [docker][docker].

If you don't want to use docker and only want to use the Ogm2Pgbm package, you can install the content of this folder and use it as a normal catkin package.

If you still want to install all the different localization methods, you can have a look at the [docker file][docker_file] and install the respective dependencies on your local machine.

OGM2PGBM

Principle

The workflow of OGM2PGBM is as follows; see the function new_map_callback(self, grid_map) for details:

  1. Subscribe to the map topic
  2. Skeletonize the map and extract its Voronoi waypoints (see self.skeletonize())
  3. Perform coverage path planning (CPP) on it (see self.CPP())
    • Extract the farthest endpoint pair first as the start and goal points
    • Then dilate it with a 2x2 kernel to bold the centerline
    • Run the wavefront CPP algorithm to obtain the sorted waypoints
  4. Perform raytracing on the waypoints one by one, and publish the /laserscan topic (see self.raytracer())

It produces the /tf, /clock, /odom, and /scan topics with the frames robot_map, robot_odom and robot_base_link.

Since /tf is needed, python2.7 is used in this script.
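The sweep-and-raytrace idea behind steps 3 and 4 can be sketched in a few lines. The following Python 3 sketch is illustrative only, not the package's actual code (which is Python 2 and ROS-dependent): a plain breadth-first "wavefront" pass stands in for the CPP step, and a simple grid line-walk stands in for the raytracer. All names are hypothetical.

```python
from collections import deque

def wavefront_order(free, start):
    """Breadth-first 'wavefront' pass over free cells, starting at `start`.
    Returns cells in visit order -- the core idea of a wavefront CPP sweep."""
    rows, cols = len(free), len(free[0])
    order, seen, queue = [], {start}, deque([start])
    while queue:
        r, c = queue.popleft()
        order.append((r, c))
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and free[nr][nc] and (nr, nc) not in seen:
                seen.add((nr, nc))
                queue.append((nr, nc))
    return order

def raycast(occupied, origin, target):
    """Walk the grid line from origin toward target; return the first
    occupied cell hit (a simulated range reading), or None if no hit."""
    (r0, c0), (r1, c1) = origin, target
    steps = max(abs(r1 - r0), abs(c1 - c0))
    for i in range(1, steps + 1):
        r = r0 + round(i * (r1 - r0) / steps)
        c = c0 + round(i * (c1 - c0) / steps)
        if occupied[r][c]:
            return (r, c)
    return None
```

In the real package, each waypoint from the sweep becomes a virtual scan pose, and one ray is cast per beam angle to synthesize the /laserscan messages.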

Running the code

This package already provides a Dockerfile, which means you can run it on any Ubuntu version. However, for ROS2 users, the final step of generating .pbstream or .posegraph is different. I will explain this in detail below.

This might also be a way to edit the posegraph map, but it's a bit convoluted.

I. Refine and Edit the PGM Map

Modify the map as needed by adding or removing barriers. The most crucial step is to completely black out the obstacles, ensuring there are no white areas within any obstacles!

|Before|After|
|:-:|:-:|
|(screenshot of the original map)|(screenshot of the edited map)|
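If you prefer to automate the blackout step, a small script can clamp every non-free pixel to pure black. The sketch below is a hypothetical helper, not part of this repo: it handles only minimal binary (P5) PGM headers without comment lines, and the `occupied_below` threshold is an assumption you should tune for your map.

```python
def threshold_pgm(src, dst, occupied_below=250):
    """Force every pixel darker than `occupied_below` to pure black, so
    obstacles contain no light pixels. Minimal P5 reader: assumes a
    'P5\\n<w> <h>\\n<maxval>\\n' header with no comment lines."""
    with open(src, "rb") as f:
        magic = f.readline().strip()
        if magic != b"P5":
            raise ValueError("only binary (P5) PGM files are handled")
        dims = f.readline().strip()      # b"<width> <height>"
        maxval = f.readline().strip()    # b"<maxval>", usually b"255"
        pixels = bytearray(f.read())     # raw grayscale bytes
    for i, value in enumerate(pixels):
        if value < occupied_below:
            pixels[i] = 0                # blacken anything not clearly free
    with open(dst, "wb") as f:
        f.write(b"P5\n" + dims + b"\n" + maxval + b"\n" + bytes(pixels))
```

Manual editing in an image editor remains the safer option when the map has gray unknown regions you want to keep.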

II. Use Ogm2Pgbm to Obtain a Rosbag with PointCloud and Poses

2.1 Use Docker to Obtain the Package
  1. First clone the repository

    git clone https://github.com/MigVega/Ogm2Pgbm.git
    cd Ogm2Pgbm
    
  2. Pull the Docker image

    LihanChen has built a Docker image for easy deployment. You can also choose to build your own Docker image.

    docker pull lihanchen2004/ogm2pgbm:latest
    
  3. Create the container from the Docker image

    If you are using a self-built image, please modify the image_name at the beginning of autorun.sh.

    ./autorun.sh
    
2.2 Start the Package

Copy the .pgm and .yaml files you need to convert to the Ogm2Pgbm/workspace/map/ directory on the host machine.

All following operations should be performed inside the Docker container.
Ensure the current terminal is within the Docker container.

MAP_NAME=OGM_empty

roslaunch ogm2pgbm ogm2pgbm.launch map_file:=/root/workspace/map/$MAP_NAME.yaml record:=true

Wait for 2-3 minutes. After completion, the program will output "Done" in the terminal.

Terminate the program with Ctrl+C. The rosbag will automatically be saved to /root/.ros/ogm2pgbm_sensordata.bag.

III. For ROS1 users

If you want to use posegraph or pbstream file directly in ROS1, please follow the tutorial in this section.

After generating bagfiles, use Cartographer to generate pbstream maps, or SLAM toolbox to generate posegraph maps. With the following command, Cartographer runs in offline mode, which generates the pbstream quite fast, but without any visual output in rviz.

roslaunch cartographer_ros ogm2pgbm_my_robot.launch bag_filename:=/root/.ros/ogm2pgbm_sensordata.bag

You can also launch Slam_toolbox. (Some errors will be reported in the terminal; just ignore them and wait a few seconds.)

roslaunch slam_toolbox ogm2pgbm.launch bag_filename:=/root/.ros/ogm2pgbm_sensordata.bag

ogm2pgbm_posegraph

The target pbstream file will be generated automatically at /root/.ros/ogm2pgbm_sensordata.bag.pbstream after the run completes. For slam_toolbox, you also need to click the serialization button in the rviz plugin. The target files are also located at /root/.ros.

IV. For ROS2 users

If you want to use posegraph or pbstream file in ROS2, please follow the tutorial in this section.

All following operations should be performed on the host machine.

4.1 Download the Rosbag to the Host Machine
  1. Get the CONTAINER ID of ogm2pgbm:

    docker ps
    
  2. Save the rosbag to the host machine:

    Remember to modify CONTAINER_ID! Here, the rosbag is saved from the container to the host's Downloads directory:

    CONTAINER_ID=xxx
    CONTAINER_PATH=/root/.ros/ogm2pgbm_sensordata.bag
    DST_PATH=~/Downloads/
    
    docker cp $CONTAINER_ID:$CONTAINER_PATH $DST_PATH
    
4.2 Convert .bag to .db3

In ROS1, rosbag files have a .bag suffix, which is a binary format for storing ROS messages. ROS2 has improved and extended the rosbag format: by default, rosbag2 stores messages in an SQLite database (.db3) inside a bag directory, alongside a metadata.yaml file, so the ROS1 .bag must be converted before it can be played back in ROS2.
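A converted rosbag2 .db3 file is a plain SQLite database, which makes quick sanity checks easy. The sketch below (Python 3, standard library only) lists the topics recorded in a .db3 file, assuming the standard rosbag2 schema with its `topics` table; `list_db3_topics` is a hypothetical helper, not part of this repo.

```python
import sqlite3

def list_db3_topics(db3_path):
    """Return {topic_name: message_type} from a rosbag2 .db3 file,
    assuming the standard rosbag2 SQLite schema ('topics' table)."""
    with sqlite3.connect(db3_path) as conn:
        rows = conn.execute("SELECT name, type FROM topics").fetchall()
    return dict(rows)
```

After conversion, you would expect to see entries such as /scan, /odom, /tf and /clock for a bag produced by ogm2pgbm.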
