Cyrill Stachniss, John J. Leonard and Sebastian Thrun
This chapter provides a comprehensive introduction to the simultaneous localization and mapping problem, better known in its abbreviated form as SLAM. SLAM addresses the main perception problem of a robot navigating an unknown environment: while navigating, the robot seeks to acquire a map of the environment, and at the same time it wishes to localize itself relative to this map. SLAM can be motivated in two different ways: one might be interested in detailed environment models, or one might seek to maintain an accurate sense of a mobile robot’s location. SLAM serves both of these purposes.
We review the three major paradigms from which many published methods for SLAM are derived: (1) the extended Kalman filter (EKF); (2) particle filtering; and (3) graph optimization. We also review recent work in three-dimensional (3-D) SLAM using visual and red green blue distance (RGB-D) sensors, and close with a discussion of open research problems in robotic mapping.
MonoSLAM: Real-time single camera SLAM
Author Andrew Davison
Video ID: 453
This video describes MonoSLAM, an influential early real-time, single-camera, visual SLAM system, described in Chap. 46.4, Springer Handbook of Robotics, 2nd edn (2016).
Reference: A.J. Davison, I. Reid, N. Molton, O. Stasse: MonoSLAM: Real-time single camera SLAM, IEEE Trans. Pattern Anal. Mach. Intell. 29(6), 1052-1067 (2007).