A Look at Improving Robustness in Visual-inertial SLAM by Moment Matching

Type
Conference article in proceedings
Date
2022
Language
en
Series
Proceedings of the 25th International Conference on Information Fusion, FUSION 2022
Abstract
The fusion of camera and inertial sensor data is a leading method for ego-motion tracking in autonomous and smart devices. State estimation techniques that rely on nonlinear filtering are a strong paradigm for solving the associated information fusion task. The de facto inference method in this space is the celebrated extended Kalman filter (EKF), which relies on first-order linearizations of both the dynamical and measurement models. This paper takes a critical look at the practical implications and limitations posed by the EKF, especially under faulty visual feature associations and in the presence of strong confounding noise. As an alternative, we revisit the assumed density formulation of Bayesian filtering and employ a moment matching (unscented Kalman filtering) approach to both visual-inertial odometry and visual SLAM. Our results highlight important aspects of robustness in both dynamics propagation and visual measurement updates, and we show state-of-the-art results on the EuRoC MAV drone data benchmark.
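
As a concrete illustration of the moment-matching idea mentioned in the abstract, below is a minimal sketch of the unscented transform at the core of sigma-point (unscented Kalman) filtering: a Gaussian belief is propagated through a nonlinear model by pushing deterministically chosen sigma points through the function and matching the resulting mean and covariance. The function name unscented_transform, the parameter defaults, and the toy dynamics model are illustrative assumptions for this sketch, not code or settings taken from the paper.

import numpy as np

def unscented_transform(f, m, P, Q, alpha=1.0, beta=2.0, kappa=1.0):
    """Propagate the Gaussian N(m, P) through a nonlinear map f by
    matching the mean and covariance over 2n+1 sigma points; Q is an
    additive noise covariance (illustrative sketch, not the paper's code)."""
    n = m.shape[0]
    lam = alpha**2 * (n + kappa) - n

    # Sigma points: the mean, plus/minus scaled columns of a Cholesky
    # factor of the covariance.
    L = np.linalg.cholesky((n + lam) * P)
    sigmas = np.vstack([m, m + L.T, m - L.T])            # (2n+1, n)

    # Standard unscented-transform weights for the mean (wm) and covariance (wc).
    wm = np.full(2 * n + 1, 1.0 / (2.0 * (n + lam)))
    wc = wm.copy()
    wm[0] = lam / (n + lam)
    wc[0] = lam / (n + lam) + (1.0 - alpha**2 + beta)

    # Push the sigma points through the nonlinearity and match moments.
    Y = np.array([f(s) for s in sigmas])                 # (2n+1, d)
    mean = wm @ Y
    diff = Y - mean
    cov = diff.T @ (wc[:, None] * diff) + Q
    return mean, cov

# Toy usage: one prediction step through a mildly nonlinear dynamics model.
f = lambda x: np.array([x[0] + 0.1 * np.sin(x[1]), 0.9 * x[1]])
m0 = np.array([0.0, 1.0])
P0 = 0.1 * np.eye(2)
Q = 1e-3 * np.eye(2)
m1, P1 = unscented_transform(f, m0, P0, Q)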
Description
Publisher Copyright: © 2022 International Society of Information Fusion.
Citation
Solin, A., Li, R. & Pilzer, A. 2022, 'A Look at Improving Robustness in Visual-inertial SLAM by Moment Matching', in Proceedings of the 25th International Conference on Information Fusion, FUSION 2022, International Society of Information Fusion, International Conference on Information Fusion, Linköping, Sweden, 04/07/2022. https://doi.org/10.23919/FUSION49751.2022.9841259