MUSE: A Real-Time Multi-Sensor State Estimator for Quadruped Robots

Cited by: 0
Authors
Nistico, Ylenia [1 ,2 ]
Soares, Joao Carlos Virgolino [1 ]
Amatucci, Lorenzo [1 ,2 ]
Fink, Geoff [1 ,3 ]
Semini, Claudio [1 ]
Affiliations
[1] Ist Italiano Tecnol IIT, Dynam Legged Syst DLS, I-16163 Genoa, Italy
[2] Univ Genoa, Dipartimento Informat Bioingn Robot & Ingn Sistemi, I-16126 Genoa, Italy
[3] Thompson Rivers Univ, Dept Engn, Kamloops, BC V2C 0C8, Canada
Source
IEEE ROBOTICS AND AUTOMATION LETTERS | 2025, Vol. 10, No. 5
Keywords
Robots; Sensors; Robot sensing systems; Legged locomotion; Odometry; Cameras; Laser radar; Robot vision systems; Robot kinematics; Quadrupedal robots; State estimation; localization; sensor fusion; quadruped robots
DOI
10.1109/LRA.2025.3553047
CLC Classification Number
TP24 [Robotics]
Subject Classification Codes
080202; 1405
Abstract
This letter introduces MUSE (MUlti-sensor State Estimator), a state estimator designed to improve the accuracy and real-time performance of state estimation for quadruped robot navigation. The proposed estimator builds on our previous work (Fink et al. 2020). It fuses data from a range of onboard sensors, including IMUs, encoders, cameras, and LiDARs, to deliver a comprehensive and reliable estimate of the robot's pose and motion, even in slippery scenarios. We tested MUSE on a Unitree Aliengo robot, successfully closing the locomotion control loop in difficult scenarios, including slippery and uneven terrain. Benchmarking against Pronto (Camurri et al. 2020) and VILENS (Wisth et al. 2022) showed reductions in translational error of 67.6% and 26.7%, respectively. MUSE also outperformed DLIO (Chen et al. 2023), a LiDAR-inertial odometry system, in rotational error and update frequency, while the proprioceptive version of MUSE (P-MUSE) outperformed TSIF (Bloesch et al. 2018) with a 45.9% reduction in absolute trajectory error (ATE).
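The abstract quantifies improvements via the absolute trajectory error (ATE). As an illustration only, and not the authors' implementation, the standard translational ATE can be sketched as: rigidly align the estimated trajectory to the ground truth (Horn's SVD-based method, without scale), then take the RMSE of the residual position errors:

```python
import numpy as np

def absolute_trajectory_error(est, gt):
    """Translational ATE: RMSE between ground-truth positions and the
    estimated positions after an optimal rigid (rotation + translation)
    alignment. `est` and `gt` are (N, 3) arrays of matched positions."""
    est = np.asarray(est, dtype=float)
    gt = np.asarray(gt, dtype=float)
    # Center both trajectories.
    mu_e, mu_g = est.mean(axis=0), gt.mean(axis=0)
    E, G = est - mu_e, gt - mu_g
    # Optimal rotation from the SVD of the cross-covariance matrix,
    # with a sign correction to exclude reflections (Horn/Kabsch).
    U, _, Vt = np.linalg.svd(G.T @ E)
    S = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])
    R = U @ S @ Vt
    t = mu_g - R @ mu_e
    # Apply the alignment, then take the RMSE of the residuals.
    aligned = est @ R.T + t
    return float(np.sqrt(np.mean(np.sum((aligned - gt) ** 2, axis=1))))
```

Because of the alignment step, a trajectory that differs from the ground truth only by a constant offset or rotation yields an ATE near zero; the metric measures drift and shape error, not the choice of reference frame.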
Pages: 4620-4627
Number of pages: 8
Related References
31 references in total
[1]  
Amatucci L, 2024, IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), p. 12734, DOI 10.1109/IROS58592.2024.10801676
[2]   On Autonomous Spatial Exploration with Small Hexapod Walking Robot using Tracking Camera Intel RealSense T265 [J].
Bayer, Jan ;
Faigl, Jan .
2019 EUROPEAN CONFERENCE ON MOBILE ROBOTS (ECMR), 2019,
[3]  
Bloesch M., 2013, Robotics, V17, P17, DOI 10.15607/RSS.2012.VIII.003
[4]   The Two-State Implicit Filter Recursive Estimation for Mobile Robots [J].
Bloesch, Michael ;
Burri, Michael ;
Sommer, Hannes ;
Siegwart, Roland .
IEEE ROBOTICS AND AUTOMATION LETTERS, 2018, 3 (01) :573-580
[5]  
Bloesch M, 2013, IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), p. 6058, DOI 10.1109/IROS.2013.6697236
[6]   Past, Present, and Future of Simultaneous Localization and Mapping: Toward the Robust-Perception Age [J].
Cadena, Cesar ;
Carlone, Luca ;
Carrillo, Henry ;
Latif, Yasir ;
Scaramuzza, Davide ;
Neira, Jose ;
Reid, Ian ;
Leonard, John J. .
IEEE TRANSACTIONS ON ROBOTICS, 2016, 32 (06) :1309-1332
[7]   ORB-SLAM3: An Accurate Open-Source Library for Visual, Visual-Inertial, and Multimap SLAM [J].
Campos, Carlos ;
Elvira, Richard ;
Gomez Rodriguez, Juan J. ;
Montiel, Jose M. M. ;
Tardos, Juan D. .
IEEE TRANSACTIONS ON ROBOTICS, 2021, 37 (06) :1874-1890
[8]   Pronto: A Multi-Sensor State Estimator for Legged Robots in Real-World Scenarios [J].
Camurri, Marco ;
Ramezani, Milad ;
Nobili, Simona ;
Fallon, Maurice .
FRONTIERS IN ROBOTICS AND AI, 2020, 7
[9]   Direct LiDAR-Inertial Odometry: Lightweight LIO with Continuous-Time Motion Correction [J].
Chen, Kenny ;
Nemiroff, Ryan ;
Lopez, Brett T. .
2023 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION, ICRA, 2023, :3983-3989
[10]   On State Estimation for Legged Locomotion over Soft Terrain [J].
Fahmi S. ;
Fink G. ;
Semini C. .
IEEE Sensors Letters, 2021, 5 (01) :1-4