S-PTAM: Stereo Parallel Tracking and Mapping


S-PTAM is a Stereo SLAM system able to compute the camera trajectory in real-time. It heavily exploits the parallel nature of the SLAM problem, separating the time-constrained pose estimation from less pressing matters such as map building and refinement tasks. On the other hand, the stereo setting allows to reconstruct a metric 3D map for each frame of stereo images, improving the accuracy of the mapping process with respect to monocular SLAM and avoiding the well-known bootstrapping problem. Also, the real scale of the environment is an essential feature for robots which have to interact with their surrounding workspace.

<a href="http://www.youtube.com/watch?feature=player_embedded&v=ojBB07JvDrY" target="_blank"><img src="http://img.youtube.com/vi/ojBB07JvDrY/0.jpg" alt="S-PTAM video" width="560" height="315" border="0" /></a>
(Click the image to redirect to S-PTAM video)

Related Publications:

[1] Taihú Pire, Thomas Fischer, Gastón Castro, Pablo De Cristóforis, Javier Civera and Julio Jacobo Berlles. S-PTAM: Stereo Parallel Tracking and Mapping. Robotics and Autonomous Systems, 2017.

[2] Taihú Pire, Thomas Fischer, Javier Civera, Pablo De Cristóforis and Julio Jacobo Berlles. Stereo Parallel Tracking and Mapping for Robot Localization. Proc. of the International Conference on Intelligent Robots and Systems (IROS), Hamburg, Germany, 2015.

License

S-PTAM is released under GPLv3 license.

For a closed-source version of S-PTAM for commercial purposes, please contact the authors.

If you use S-PTAM in an academic work, please cite:

@article{pire2017sptam,
          title = {{S-PTAM: Stereo Parallel Tracking and Mapping}},
          author = {Pire, Taih{\'u} and Fischer, Thomas and Castro, Gast{\'o}n and De Crist{\'o}foris, Pablo and Civera, Javier and Jacobo Berlles, Julio},
          journal = {Robotics and Autonomous Systems (RAS)},
          volume = {93},
          pages = {27--42},
          year = {2017},
          issn = {0921-8890},
          doi = {10.1016/j.robot.2017.03.019}
}

@inproceedings{pire2015sptam,
          title={{Stereo Parallel Tracking and Mapping for robot localization}},
          author={Pire, Taih{\'u} and Fischer, Thomas and Civera, Javier and De Crist{\'o}foris, Pablo and Jacobo Berlles, Julio},
          booktitle={Proc. of the International Conference on Intelligent Robots and Systems (IROS)},
          pages = {1373--1378},
          year={2015},
          month = {September},
          doi = {10.1109/IROS.2015.7353546}
}

Disclaimer

This site and the code provided here are under active development. Even though we try to only release working high quality code, this version might still contain some issues. Please use it with caution.

Dependencies

ROS

We have tested S-PTAM in Ubuntu 16.04 with ROS Kinetic.

To install ROS (Kinetic) use the following command:

sudo apt-get install ros-kinetic-desktop
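After installing, the ROS environment must be sourced in each new shell (or added to ~/.bashrc) so that commands like rosparam and roslaunch are on the path:

```shell
# Make the Kinetic installation visible to this shell.
source /opt/ros/kinetic/setup.bash
```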

SuiteSparse

SuiteSparse is a dependency, so it needs to be installed:

sudo apt-get install libsuitesparse-dev

ros-utils

Install our ros-utils library from source:

git clone git@github.com:lrse/ros-utils.git
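ros-utils is built as part of a catkin workspace; a sketch of a typical setup, assuming a workspace at ~/catkin_ws (a hypothetical path) and that ros-utils builds as a standard catkin package:

```shell
# Assuming the repository was cloned as above; move it into the
# workspace's src directory and build the workspace with catkin.
mv ros-utils ~/catkin_ws/src/
cd ~/catkin_ws
catkin_make
```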

g2o

Install the g2o library from source:

git clone git@github.com:RainerKuemmerle/g2o.git

Tested until commit 4b9c2f5b68d14ad479457b18c5a2a0bce1541a90

git checkout 4b9c2f5b68d14ad479457b18c5a2a0bce1541a90

mkdir build && cd build
cmake ..
make 
sudo make install

Loop Closure dependencies

These dependencies are only required when the USE_LOOPCLOSURE flag is enabled.

DBoW2 vocabularies are available through a git submodule in the bow_voc directory:

git submodule update --init --recursive

DLib

Install DLib library from source code

git clone git@github.com:dorian3d/DLib.git

Tested until commit 70089a38056e8aebd5a2ebacbcb67d3751433f32

git checkout 70089a38056e8aebd5a2ebacbcb67d3751433f32

DBoW2

Install DBoW2 library from source code

git clone git@github.com:dorian3d/DBoW2.git

Tested until commit 82401cad2cfe7aa28ee6f6afb01ce3ffa0f59b44

git checkout 82401cad2cfe7aa28ee6f6afb01ce3ffa0f59b44

DLoopDetector

Install DLoopDetector library from source code

git clone git@github.com:dorian3d/DLoopDetector.git

Tested until commit 8e62f8ae84d583d9ab67796f779272b0850571ce

git checkout 8e62f8ae84d583d9ab67796f779272b0850571ce

OpenGV

Install OpenGV library from source code

git clone git@github.com:laurentkneip/opengv.git

Tested until commit 2e2d21917fd2fb75f2134e6d5be7a2536cbc7eb1

git checkout 2e2d21917fd2fb75f2134e6d5be7a2536cbc7eb1
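Each of the loop-closure libraries above (DLib, DBoW2, DLoopDetector, OpenGV) follows the standard CMake out-of-source build; a sketch, run from inside each cloned repository, assuming the default install prefix (/usr/local) is acceptable:

```shell
# Standard CMake out-of-source build and system-wide install.
mkdir build && cd build
cmake ..
make -j"$(nproc)"
sudo make install
```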

Installation

git clone git@github.com:lrse/sptam.git

ROS Package

ROS Compilation

catkin_make --pkg sptam -DCMAKE_BUILD_TYPE=RelWithDebInfo -DSINGLE_THREAD=OFF -DSHOW_TRACKED_FRAMES=ON -DSHOW_PROFILING=ON -DPARALLELIZE=ON

To activate loop closing capabilities (requires the DBoW2 and OpenGV dependencies):

catkin_make --pkg sptam -DCMAKE_BUILD_TYPE=RelWithDebInfo -DUSE_LOOPCLOSURE=ON -DSINGLE_THREAD=OFF -DSHOW_TRACKED_FRAMES=ON -DSHOW_PROFILING=ON -DPARALLELIZE=ON

For more information about compilation flags see CMAKE flags section.

Tutorials

We provide some examples of how to run S-PTAM on the most popular stereo datasets:

KITTI dataset

  1. Download the KITTI rosbag kitti_00.bag provided in KITTI rosbag files

  2. Uncompress the dataset

    rosbag decompress kitti_00.bag

  3. Set the use_sim_time ROS parameter to true

    rosparam set use_sim_time true

  4. Play the dataset

    rosbag play --clock kitti_00.bag

    (When S-PTAM is run with the flag SHOW_TRACKED_FRAMES=ON, performance is noticeably reduced.)

  5. Run S-PTAM using kitti.launch

    roslaunch sptam kitti.launch
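The KITTI steps above, collected into a single session. In practice rosbag play and roslaunch run in separate terminals; the & here only backgrounds the replay so the sketch reads sequentially:

```shell
# Replay the KITTI sequence against S-PTAM using simulated time.
rosbag decompress kitti_00.bag
rosparam set use_sim_time true
rosbag play --clock kitti_00.bag &
roslaunch sptam kitti.launch
```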

EuRoC MAV dataset

  1. Download the EuRoC rosbag Machine Hall 01 provided on the EuRoC MAV web page

  2. Add left and right camera_info messages in the rosbag

    The S-PTAM package provides the script euroc_add_camera_info.py, which adds left and right sensor_msgs/CameraInfo messages to the EuRoC MAV rosbags.

    python sptam_directory/scripts/euroc_add_camera_info.py MH_01_easy.bag /mav0/cam0/sensor.yaml /mav0/cam1/sensor.yaml

  3. Set the use_sim_time ROS parameter to true

    rosparam set use_sim_time true

  4. Play the dataset

    rosbag play --clock MH_01_easy_with_camera_info.bag -s 50

  5. Run S-PTAM using euroc.launch

    roslaunch sptam euroc.launch

MIT Stata Center dataset

  1. Download the MIT Stata Center rosbag 2012-01-27-07-37-01.bag provided in MIT Stata Center Web Page

  2. Set the use_sim_time ROS parameter to true

    rosparam set use_sim_time true

  3. Play the dataset

    rosbag play --clock 2012-01-27-07-37-01.bag -s 302.5 -u 87

    (Here we run part 3 of the sequence, for which ground truth is provided; that is why the bag file starts from a different timestamp.)
