Chapter 46 — Simultaneous Localization and Mapping

Cyrill Stachniss, John J. Leonard and Sebastian Thrun

This chapter provides a comprehensive introduction to the simultaneous localization and mapping problem, better known by its abbreviation SLAM. SLAM addresses the main perception problem of a robot navigating an unknown environment: while navigating, the robot seeks to acquire a map of its surroundings and, at the same time, to localize itself relative to this map. SLAM can be motivated in two different ways: one might be interested in detailed environment models, or one might seek to maintain an accurate estimate of a mobile robot's location. SLAM serves both of these purposes.
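As a brief sketch in standard probabilistic notation (the symbols below follow common SLAM convention and are not quoted from the chapter), the full SLAM problem can be stated as estimating the joint posterior over the robot trajectory x_{1:t} and the map m, given the observations z_{1:t} and the controls u_{1:t}:

p(x_{1:t}, m \mid z_{1:t}, u_{1:t}) .

The online variant estimates only the current pose together with the map, p(x_t, m \mid z_{1:t}, u_{1:t}), obtained by marginalizing out the past poses.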

We review the three major paradigms from which many published methods for SLAM are derived: (1) the extended Kalman filter (EKF); (2) particle filtering; and (3) graph optimization. We also review recent work on three-dimensional (3-D) SLAM using visual and red-green-blue-depth (RGB-D) sensors, and close with a discussion of open research problems in robotic mapping.
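To make the third paradigm slightly more concrete (a standard graph-based SLAM formulation in common notation, not an equation quoted from the chapter), graph optimization treats the poses x = (x_1, ..., x_t) as nodes and the relative measurements between them as edges, and computes the configuration that minimizes the sum of squared constraint errors:

x^* = \arg\min_x \sum_{\langle i,j \rangle} e_{ij}(x_i, x_j)^T \, \Omega_{ij} \, e_{ij}(x_i, x_j) ,

where e_{ij} denotes the difference between the relative transformation predicted from x_i and x_j and the one actually measured, and \Omega_{ij} is the information matrix of that measurement.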

Pose graph compression for laser-based SLAM 2

Author  Cyrill Stachniss

Video ID : 450

This video illustrates pose graph compression, a technique for achieving long-term SLAM, as discussed in Chap. 46.5, Springer Handbook of Robotics, 2nd edn (2016). Reference: H. Kretzschmar, C. Stachniss: Information-theoretic compression of pose graphs for laser-based SLAM, Int. J. Robot. Res. 31(11), 1219-1230 (2012).

Pose graph compression for laser-based SLAM 3

Author  Cyrill Stachniss

Video ID : 451

This video illustrates pose graph compression, a technique for achieving long-term SLAM, as discussed in Chap. 46.5, Springer Handbook of Robotics, 2nd edn (2016). Reference: H. Kretzschmar, C. Stachniss: Information-theoretic compression of pose graphs for laser-based SLAM, Int. J. Robot. Res. 31(11), 1219-1230 (2012).

DTAM: Dense tracking and mapping in real-time

Author  Richard Newcombe

Video ID : 452

This video shows DTAM (dense tracking and mapping in real-time), a system for real-time, fully dense visual tracking and reconstruction, described in Chap. 46.4, Springer Handbook of Robotics, 2nd edn (2016). Reference: R.A. Newcombe, S.J. Lovegrove, A.J. Davison: DTAM: Dense tracking and mapping in real-time, Proc. Int. Conf. Computer Vision (ICCV), Barcelona (2011), pp. 2320-2327.

MonoSLAM: Real-time single camera SLAM

Author  Andrew Davison

Video ID : 453

This video describes MonoSLAM, an influential early real-time, single-camera visual SLAM system, described in Chap. 46.4, Springer Handbook of Robotics, 2nd edn (2016). Reference: A.J. Davison, I. Reid, N. Molton, O. Stasse: MonoSLAM: Real-time single camera SLAM, IEEE Trans. Pattern Anal. Mach. Intell. 29(6), 1052-1067 (2007).

SLAM++: Simultaneous localization and mapping at the level of objects

Author  Andrew Davison

Video ID : 454

This video describes SLAM++, an object-based, 3-D SLAM system. Reference: R.F. Salas-Moreno, R.A. Newcombe, H. Strasdat, P.H.J. Kelly, A.J. Davison: SLAM++: Simultaneous localisation and mapping at the level of objects, Proc. IEEE Conf. Computer Vision Pattern Recognition (CVPR), Portland (2013).

Extended Kalman-filter SLAM

Author  John Leonard

Video ID : 455

This video shows an illustration of extended Kalman filter (EKF) SLAM, as described in Chap. 46.3.1, Springer Handbook of Robotics, 2nd edn (2016). Reference: J.J. Leonard, H. Feder: A computationally efficient method for large-scale concurrent mapping and localization, Proc. Int. Symp. Robot. Res. (ISRR), Salt Lake City (2000), pp. 169-176.
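As a rough companion to this video, the following is a minimal, hedged sketch of one EKF prediction/update cycle for a toy SLAM state consisting of a 2-D robot position and one static 2-D landmark. The linear motion and measurement models (and hence the constant Jacobians), the noise values, and all variable names are illustrative assumptions, not code from the chapter or the cited paper.

# Minimal EKF-style SLAM sketch: one robot (x, y) and one landmark (lx, ly).
# Illustrative assumption: linear motion and relative-position measurements,
# so the EKF Jacobians reduce to constant matrices. Not the system in the video.
import numpy as np

F = np.eye(4)                                   # motion model: landmark is static
B = np.vstack([np.eye(2), np.zeros((2, 2))])    # control displaces only the robot
H = np.hstack([-np.eye(2), np.eye(2)])          # measurement: landmark minus robot
R = 0.01 * np.eye(2)                            # motion noise (robot only)
Q = 0.05 * np.eye(2)                            # measurement noise

def predict(mu, Sigma, u):
    mu = F @ mu + B @ u
    G = F                                       # Jacobian of the motion model
    Sigma = G @ Sigma @ G.T + B @ R @ B.T
    return mu, Sigma

def update(mu, Sigma, z):
    z_hat = H @ mu                              # expected relative observation
    S = H @ Sigma @ H.T + Q                     # innovation covariance
    K = Sigma @ H.T @ np.linalg.inv(S)          # Kalman gain
    mu = mu + K @ (z - z_hat)
    Sigma = (np.eye(4) - K @ H) @ Sigma
    return mu, Sigma

# One predict/update cycle with made-up numbers.
mu = np.array([0.0, 0.0, 2.0, 1.0])             # [robot_x, robot_y, lm_x, lm_y]
Sigma = np.diag([0.0, 0.0, 1.0, 1.0])           # landmark initially uncertain
mu, Sigma = predict(mu, Sigma, u=np.array([1.0, 0.0]))
mu, Sigma = update(mu, Sigma, z=np.array([1.1, 0.9]))
print(mu)

In a full EKF SLAM implementation the state vector contains the robot pose and all landmarks, and the Jacobians G and H are recomputed at every step by linearizing the nonlinear motion and observation models around the current estimate.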