SPS
Generalizable Stable Points Segmentation for 3D LiDAR Scan-to-Map Long-Term Localization
Install / Use
<i>Our stable points segmentation predictions for three datasets. Stable points are shown in black, unstable points in red.</i>
Building the Docker image
We provide a Dockerfile and a docker-compose.yaml to run all docker commands.
IMPORTANT To have GPU access during the build stage, make nvidia the default runtime in /etc/docker/daemon.json:
```json
{
"runtimes": {
"nvidia": {
"path": "/usr/bin/nvidia-container-runtime",
"runtimeArgs": []
}
},
"default-runtime": "nvidia"
}
```
Save the file and run `sudo systemctl restart docker` to restart Docker.
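If your `/etc/docker/daemon.json` already contains other settings, you may prefer to merge the `nvidia` runtime entry into it rather than overwrite the file. The helper below is a convenience sketch and not part of this repository:

```python
import json

def add_nvidia_runtime(cfg: dict) -> dict:
    """Merge the nvidia runtime entry into an existing daemon.json dict,
    preserving any other settings already present."""
    cfg = dict(cfg)
    runtimes = dict(cfg.get("runtimes", {}))
    runtimes["nvidia"] = {
        "path": "/usr/bin/nvidia-container-runtime",
        "runtimeArgs": [],
    }
    cfg["runtimes"] = runtimes
    cfg["default-runtime"] = "nvidia"
    return cfg

# Example: read, merge, and print the updated config
# with open("/etc/docker/daemon.json") as f:
#     cfg = json.load(f)
# print(json.dumps(add_nvidia_runtime(cfg), indent=2))
```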
To build the image, simply type the following in the terminal:
bash build_docker.sh
Once the build process finishes, initiate the Docker container in detached mode using Docker Compose from the project directory:
docker-compose up -d # or [docker compose up -d] with newer Docker versions
Usage Instructions
Training
To train the model with the parameters specified in config/config.yaml, follow these steps:
1. Export the path to the dataset (this step may be necessary before initiating the container):

   export DATA=path/to/dataset

2. Initiate training by executing the following command from within the container:

   python scripts/train.py
Segmentation Metrics
To evaluate the segmentation metrics for a specific sequence:
python scripts/predict.py -seq <SEQ ID>
This command will generate reports for the following metrics:
- uIoU (unstable points IoU)
- Precision
- Recall
- F1 score
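As a rough illustration, the metrics above are typically defined as follows for the binary unstable class; the function name and input conventions here are illustrative, not the repository's actual evaluation code:

```python
import numpy as np

def segmentation_metrics(pred, gt):
    """Compute uIoU, precision, recall, and F1 for binary labels.
    pred, gt: boolean arrays where True marks an *unstable* point."""
    pred = np.asarray(pred, dtype=bool)
    gt = np.asarray(gt, dtype=bool)
    tp = np.sum(pred & gt)   # unstable points correctly predicted
    fp = np.sum(pred & ~gt)  # stable points predicted as unstable
    fn = np.sum(~pred & gt)  # unstable points missed
    uiou = tp / (tp + fp + fn)           # IoU of the unstable class
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return uiou, precision, recall, f1
```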
Localization
Install and build the following packages in your catkin_ws:
cd </path/to/catkin_ws>/src
git clone https://github.com/koide3/ndt_omp
git clone https://github.com/SMRT-AIST/fast_gicp --recursive
git clone https://github.com/koide3/hdl_global_localization
git clone --branch SPS https://github.com/ibrahimhroob/hdl_localization.git
cd ..
catkin build
source devel/setup.bash
Then, the localization experiment can be run using a single command:
bash exp_pipeline/loc_exp_general.bash
To calculate the localization metrics, please install the evo library.
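evo reports trajectory errors such as the absolute pose error (its `evo_ape` tool). As a rough sketch of what the position-only RMSE boils down to for already-aligned trajectories (this is illustrative, not a replacement for evo, which also handles alignment and rotation errors):

```python
import numpy as np

def ate_rmse(gt_xyz, est_xyz):
    """RMSE of per-pose position errors between a ground-truth and an
    estimated trajectory, both given as N x 3 arrays of positions."""
    err = np.linalg.norm(np.asarray(gt_xyz) - np.asarray(est_xyz), axis=1)
    return float(np.sqrt(np.mean(err ** 2)))
```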
Data
You can download the post-processed and labelled BLT dataset and the parking lot part of the NCLT dataset from the provided links.
The weights of our pre-trained model can be downloaded as well.
Here is the general structure of the dataset:
DATASET/
├── maps
│   ├── base_map.asc
│   ├── base_map.asc.npy
│   └── base_map.pcd
└── sequence
    ├── SEQ
    │   ├── map_transform
    │   ├── poses
    │   │   ├── 0.txt
    │   │   └── ...
    │   └── scans
    │       ├── 0.npy
    │       └── ...
    └── ...
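Given this layout, a single frame can be loaded by pairing each scan with its pose file of the same index. The helper below is a hypothetical sketch (the exact array contents of the `.npy` scans and the pose file format are not specified here):

```python
import numpy as np
from pathlib import Path

def load_frame(seq_dir, idx):
    """Load scan idx from a sequence directory laid out as above:
    scans/<idx>.npy holds the point cloud, poses/<idx>.txt the pose."""
    seq = Path(seq_dir)
    scan = np.load(seq / "scans" / f"{idx}.npy")
    pose = np.loadtxt(seq / "poses" / f"{idx}.txt")
    return scan, pose
```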
Publication
If you use our code in your academic work, please cite the corresponding paper:
@article{hroob2024ral,
author = {I. Hroob* and B. Mersch* and C. Stachniss and M. Hanheide},
title = {{Generalizable Stable Points Segmentation for 3D LiDAR Scan-to-Map Long-Term Localization}},
journal = {IEEE Robotics and Automation Letters (RA-L)},
volume = {9},
number = {4},
pages = {3546-3553},
year = {2024},
doi = {10.1109/LRA.2024.3368236},
}
Acknowledgments
This implementation is inspired by 4DMOS.
License
This project is free software made available under the MIT License. For details see the LICENSE file.