Robocentric visual-inertial odometry

Z Huai, G Huang - The International Journal of Robotics …, 2022 - journals.sagepub.com
… novel robocentric formulation of the visual-inertial navigation … an efficient, lightweight,
robocentric visual-inertial odometry (R-… As an immediate advantage of this robocentric

Square-root robocentric visual-inertial odometry with online spatiotemporal calibration

Z Huai, G Huang - IEEE Robotics and Automation Letters, 2022 - ieeexplore.ieee.org
… efficient approach for realizing robocentric visual-inertial odometry which offers consistent
observability properties for probabilistic state estimator. Our robocentric models, including IMU …
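As a rough illustration of the square-root idea referenced in this entry, the sketch below propagates an upper-triangular covariance factor with a QR decomposition instead of forming the full covariance matrix. The function name and the NumPy-based check are assumptions for illustration only, not the paper's implementation.

```python
import numpy as np

def sr_propagate(S, F, Q_sqrt):
    """Propagate an upper-triangular square-root factor S (P = S.T @ S)
    through the linearized dynamics x' = F x with process noise
    Q = Q_sqrt.T @ Q_sqrt, without ever forming the full covariance.

    Stacking [S F^T; Q_sqrt] and taking its QR decomposition yields an
    upper-triangular R with R.T @ R = F P F.T + Q, i.e. the propagated
    covariance in factored form.
    """
    stacked = np.vstack([S @ F.T, Q_sqrt])
    _, R = np.linalg.qr(stacked)
    return R

# Tiny check that the factored step matches the full-covariance EKF step.
n = 3
rng = np.random.default_rng(0)
S = np.triu(rng.standard_normal((n, n)))
F = np.eye(n) + 0.01 * rng.standard_normal((n, n))
Q_sqrt = 0.1 * np.eye(n)
R = sr_propagate(S, F, Q_sqrt)
P_full = F @ (S.T @ S) @ F.T + Q_sqrt.T @ Q_sqrt
assert np.allclose(R.T @ R, P_full)
```

Working with the factor keeps the covariance symmetric positive semi-definite by construction, which is the usual motivation for square-root filtering.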

Learning Visual-Inertial Odometry With Robocentric Iterated Extended Kalman Filter

KD Nguyen, DT Tran, VQ Pham, DT Nguyen… - IEEE …, 2024 - ieeexplore.ieee.org
… to Visual-Inertial Odometry, we enhance the ego-motion estimation system by incorporating
a robocentric … Subsequently, we dig further into the formulation of the robocentric IEKF in …
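For context on the iterated extended Kalman filter named in this entry, a minimal generic IEKF measurement update is sketched below: the measurement model is re-linearized around each iterate, which amounts to Gauss-Newton steps on the prior-plus-measurement least-squares problem. The function names and the toy range-measurement example are assumptions for illustration, not the authors' learned pipeline.

```python
import numpy as np

def iekf_update(x_prior, P, z, h, jac_h, R, n_iter=5):
    """Iterated EKF measurement update.

    Unlike the standard EKF, the measurement model h is re-linearized
    around the current iterate x_i rather than only around the prior mean.
    """
    x_i = x_prior.copy()
    for _ in range(n_iter):
        H = jac_h(x_i)                          # Jacobian at the iterate
        S = H @ P @ H.T + R                     # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
        x_i = x_prior + K @ (z - h(x_i) - H @ (x_prior - x_i))
    P_post = (np.eye(len(x_prior)) - K @ H) @ P
    return x_i, P_post

# Example: refine a 2D position estimate from a noisy range measurement.
h = lambda x: np.array([np.linalg.norm(x)])
jac_h = lambda x: (x / np.linalg.norm(x)).reshape(1, -1)
x_prior = np.array([1.0, 1.0])
P = 0.5 * np.eye(2)
z = np.array([1.6])
R = np.array([[0.01]])
x_post, P_post = iekf_update(x_prior, P, z, h, jac_h, R)
```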

Towards end-to-end learning of visual inertial odometry with an EKF

C Li, SL Waslander - 2020 17th conference on computer and …, 2020 - ieeexplore.ieee.org
… In this paper, we propose the first end-to-end trainable visual-inertial odometry (VIO) …
an end-to-end trainable method for Visual Inertial Odometry. Our approach used a robo-centric

Robust visual inertial odometry using a direct EKF-based approach

M Bloesch, S Omari, M Hutter… - 2015 IEEE/RSJ …, 2015 - ieeexplore.ieee.org
… adapt the structure of the standard visual-inertial EKF-SLAM for… A purely robocentric
representation of the full filter state is … fully robocentric and direct monocular visual-inertial

[BOOK] Robocentric Visual-Inertial Localization and Mapping

Z Huai - 2023 - search.proquest.com
… the robocentric VINS formulation as opposed to the commonly-used world-centric formulation,
and develop the corresponding robocentric visual-inertial odometry (R-… -based robocentric

Cooperative visual-inertial odometry

P Zhu, Y Yang, W Ren, G Huang - 2021 ieee international …, 2021 - ieeexplore.ieee.org
… of multi-robot cooperative visual-inertial localization where each robot is equipped with
only a single camera and IMU. We develop two cooperative visual-inertial odometry (C-VIO) …

Iterated extended Kalman filter based visual-inertial odometry using direct photometric feedback

M Bloesch, M Burri, S Omari… - … Journal of Robotics …, 2017 - journals.sagepub.com
visual-inertial odometry. All in all, this paper describes a fully robocentric and direct visual-inertial
odometry … Our framework, which we refer to as Rovio (robust visual-inertial odometry), …

[PDF] Visual-inertial state estimation with decoupled error and state representations

C Chen, Y Peng, G Huang - Proceedings of the …, 2024 - algorithmic-robotics.org
Robocentric Formulation The robocentric VINS is … and a composition step for shifting the
robocentric frames [26], which are … Square-root robocentric visual-inertial odometry with online …
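The "composition step for shifting the robocentric frames" mentioned in this snippet can be illustrated with a plain SE(3) composition: the filter's relative-motion estimate between consecutive robocentric frames is folded into a world-frame pose kept outside the filter. The sketch below assumes rotation matrices and 3-vectors and generic names; it is not the cited papers' exact state update.

```python
import numpy as np

def compose_global_pose(R_WRk, p_WRk, R_kk1, p_kk1):
    """Shift the robocentric frame from {R_k} to {R_{k+1}}.

    (R_WRk, p_WRk): pose of frame {R_k} in the world frame {W}.
    (R_kk1, p_kk1): estimated pose of frame {R_{k+1}} expressed in {R_k}.
    The composition accumulates the global pose so the filter state itself
    can stay relative to the latest robocentric frame.
    """
    R_WRk1 = R_WRk @ R_kk1
    p_WRk1 = p_WRk + R_WRk @ p_kk1
    return R_WRk1, p_WRk1
```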

A self-supervised, differentiable Kalman filter for uncertainty-aware visual-inertial odometry

B Wagstaff, E Wise, J Kelly - 2022 IEEE/ASME International …, 2022 - ieeexplore.ieee.org
… In Section III-C, we review the robocentric EKF for VIO. Finally, in Section III-D, we present …
, the camera reference frame at time k, the robocentric reference frame at time k, and the IMU …
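Since this last snippet enumerates the camera frame, the robocentric frame, and the IMU frame at time k, a small frame-transform helper may clarify the bookkeeping: a world-frame point is mapped into the robocentric frame and then into the camera frame through an assumed camera extrinsic. The names and frame conventions are illustrative, not the paper's notation.

```python
import numpy as np

def world_to_camera(p_W, R_WR, p_WR, R_RC, p_RC):
    """Map a world-frame point p_W into the camera frame.

    (R_WR, p_WR): pose of the robocentric frame {R} in the world {W}.
    (R_RC, p_RC): pose of the camera frame {C} in {R} (the camera-IMU
    extrinsic when {R} is attached to the IMU).
    """
    p_R = R_WR.T @ (p_W - p_WR)        # world -> robocentric
    p_C = R_RC.T @ (p_R - p_RC)        # robocentric -> camera
    return p_C
```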