Simultaneous Localization and Mapping: Through the Lens of Nonlinear Optimization

11 Dec 2021 · Amay Saxena, Chih-Yuan Chiu, Joseph Menke, Ritika Shrivastava, Shankar Sastry

Simultaneous Localization and Mapping (SLAM) algorithms perform visual-inertial estimation via filtering or batch optimization methods. Empirical evidence suggests that filtering algorithms are computationally faster, while optimization methods are more accurate. This work presents an optimization-based framework that unifies these approaches and lets users flexibly implement different design choices, e.g., the number and types of variables maintained in the algorithm at each time step. We prove that filtering methods correspond to specific design choices in our generalized framework. We then reformulate the Multi-State Constrained Kalman Filter (MSCKF), evaluate the reformulation on challenging image sequences in simulation, and contrast its performance with that of sliding-window-based filters. Using these results, we explain the relative performance characteristics of these two classes of algorithms within our framework. Finally, we illustrate that, under different design choices, the empirical performance of our algorithm interpolates between those of state-of-the-art approaches.
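
To make the design-choice picture concrete, the following is a minimal sketch (not taken from the paper) of a sliding-window least-squares estimator on a 1D toy problem, where the window size W is the design knob: W = 1 with marginalization behaves like a filter, while a window covering all states recovers full batch optimization. The toy problem is kept linear so that each window solve is a single normal-equations step; in a real visual-inertial pipeline each solve would be an iterated Gauss-Newton step over relinearized factors. All names, noise levels, and the measurement model below are illustrative assumptions.

import numpy as np

# Toy 1D sliding-window estimator over positions x_0..x_T with
# relative (odometry) and absolute (position) measurements.
# The window size W is the design choice: W = 1 behaves like a filter,
# a window covering all states is full batch optimization.
# Everything below (noise levels, measurement model) is illustrative.

rng = np.random.default_rng(0)
T = 30                        # number of time steps
W = 5                         # sliding-window size (the design knob)
sigma_u, sigma_z = 0.05, 0.5  # odometry / absolute measurement noise

x_true = np.cumsum(rng.normal(1.0, 0.1, size=T + 1))     # ground-truth positions
odom = np.diff(x_true) + rng.normal(0, sigma_u, size=T)   # u_t ~ x_{t+1} - x_t
meas = x_true + rng.normal(0, sigma_z, size=T + 1)        # z_t ~ x_t

def solve_window(idx, prior_mean, prior_info):
    """Weighted least squares over the states in idx, with a Gaussian
    prior on the oldest state (the result of earlier marginalizations)."""
    k = len(idx)
    H, b = np.zeros((k, k)), np.zeros(k)      # information matrix / vector
    H[0, 0] += prior_info
    b[0] += prior_info * prior_mean
    for j, t in enumerate(idx):
        H[j, j] += 1.0 / sigma_z**2           # absolute measurement z_t
        b[j] += meas[t] / sigma_z**2
        if j + 1 < k:                         # odometry factor x_t -> x_{t+1}
            J = np.array([-1.0, 1.0]) / sigma_u
            H[j:j + 2, j:j + 2] += np.outer(J, J)
            b[j:j + 2] += J * (odom[t] / sigma_u)
    return np.linalg.solve(H, b)

def marginalize_oldest(prior_mean, prior_info, t_old):
    """Fold the factors touching x_{t_old} (the prior, z_{t_old}, and the
    odometry factor to x_{t_old+1}) into a new Gaussian prior on
    x_{t_old+1} via a Schur complement, so the window can slide
    forward without growing."""
    H, b = np.zeros((2, 2)), np.zeros(2)
    H[0, 0] += prior_info + 1.0 / sigma_z**2
    b[0] += prior_info * prior_mean + meas[t_old] / sigma_z**2
    J = np.array([-1.0, 1.0]) / sigma_u
    H += np.outer(J, J)
    b += J * (odom[t_old] / sigma_u)
    info_new = H[1, 1] - H[1, 0] * H[0, 1] / H[0, 0]
    mean_new = (b[1] - H[1, 0] * b[0] / H[0, 0]) / info_new
    return mean_new, info_new

prior_mean, prior_info = 0.0, 1e-9            # near-uninformative prior on x_0
estimates, start = [], 0
for t in range(T + 1):
    if t - start + 1 > W:                     # window full: marginalize the oldest state
        prior_mean, prior_info = marginalize_oldest(prior_mean, prior_info, start)
        start += 1
    x_hat = solve_window(list(range(start, t + 1)), prior_mean, prior_info)
    estimates.append(x_hat[-1])

rmse = np.sqrt(np.mean((np.array(estimates) - x_true) ** 2))
print(f"window size {W}, RMSE of latest-state estimates: {rmse:.3f}")

The marginalization step is what separates a filter-like configuration from simply discarding old states: information from dropped states is folded into a Gaussian prior on the oldest state remaining in the window, which is the mechanism that lets a small-window (filter-style) configuration retain past information while a large window retains the full nonlinear problem.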
