Learning Inertial Odometry for Dynamic Legged Robot State Estimation

Russell Buchanan, Marco Camurri, Frank Dellaert, Maurice Fallon
Proceedings of the 5th Conference on Robot Learning, PMLR 164:1575-1584, 2022.

Abstract

This paper introduces a novel proprioceptive state estimator for legged robots based on a learned displacement measurement from IMU data. Recent research in pedestrian tracking has shown that motion can be inferred from inertial data using convolutional neural networks. A learned inertial displacement measurement can improve state estimation in challenging scenarios where leg odometry is unreliable, such as slipping and compressible terrains. Our work learns to estimate a displacement measurement from IMU data which is then fused with traditional leg odometry. Our approach greatly reduces the drift of proprioceptive state estimation, which is critical for legged robots deployed in vision and lidar denied environments such as foggy sewers or dusty mines. We compared results from an EKF and an incremental fixed-lag factor graph estimator using data from several real robot experiments crossing challenging terrains. Our results show a reduction of relative pose error by 37% in challenging scenarios when compared to a traditional kinematic-inertial estimator without learned measurement. We also demonstrate a 22% reduction in error when used with vision systems in visually degraded environments such as an underground mine.

Cite this Paper

BibTeX
@InProceedings{pmlr-v164-buchanan22a,
  title     = {Learning Inertial Odometry for Dynamic Legged Robot State Estimation},
  author    = {Buchanan, Russell and Camurri, Marco and Dellaert, Frank and Fallon, Maurice},
  booktitle = {Proceedings of the 5th Conference on Robot Learning},
  pages     = {1575--1584},
  year      = {2022},
  editor    = {Faust, Aleksandra and Hsu, David and Neumann, Gerhard},
  volume    = {164},
  series    = {Proceedings of Machine Learning Research},
  month     = {08--11 Nov},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v164/buchanan22a/buchanan22a.pdf},
  url       = {https://proceedings.mlr.press/v164/buchanan22a.html},
  abstract  = {This paper introduces a novel proprioceptive state estimator for legged robots based on a learned displacement measurement from IMU data. Recent research in pedestrian tracking has shown that motion can be inferred from inertial data using convolutional neural networks. A learned inertial displacement measurement can improve state estimation in challenging scenarios where leg odometry is unreliable, such as slipping and compressible terrains. Our work learns to estimate a displacement measurement from IMU data which is then fused with traditional leg odometry. Our approach greatly reduces the drift of proprioceptive state estimation, which is critical for legged robots deployed in vision and lidar denied environments such as foggy sewers or dusty mines. We compared results from an EKF and an incremental fixed-lag factor graph estimator using data from several real robot experiments crossing challenging terrains. Our results show a reduction of relative pose error by 37% in challenging scenarios when compared to a traditional kinematic-inertial estimator without learned measurement. We also demonstrate a 22% reduction in error when used with vision systems in visually degraded environments such as an underground mine.}
}
Endnote
%0 Conference Paper
%T Learning Inertial Odometry for Dynamic Legged Robot State Estimation
%A Russell Buchanan
%A Marco Camurri
%A Frank Dellaert
%A Maurice Fallon
%B Proceedings of the 5th Conference on Robot Learning
%C Proceedings of Machine Learning Research
%D 2022
%E Aleksandra Faust
%E David Hsu
%E Gerhard Neumann
%F pmlr-v164-buchanan22a
%I PMLR
%P 1575--1584
%U https://proceedings.mlr.press/v164/buchanan22a.html
%V 164
%X This paper introduces a novel proprioceptive state estimator for legged robots based on a learned displacement measurement from IMU data. Recent research in pedestrian tracking has shown that motion can be inferred from inertial data using convolutional neural networks. A learned inertial displacement measurement can improve state estimation in challenging scenarios where leg odometry is unreliable, such as slipping and compressible terrains. Our work learns to estimate a displacement measurement from IMU data which is then fused with traditional leg odometry. Our approach greatly reduces the drift of proprioceptive state estimation, which is critical for legged robots deployed in vision and lidar denied environments such as foggy sewers or dusty mines. We compared results from an EKF and an incremental fixed-lag factor graph estimator using data from several real robot experiments crossing challenging terrains. Our results show a reduction of relative pose error by 37% in challenging scenarios when compared to a traditional kinematic-inertial estimator without learned measurement. We also demonstrate a 22% reduction in error when used with vision systems in visually degraded environments such as an underground mine.
APA
Buchanan, R., Camurri, M., Dellaert, F. &amp; Fallon, M. (2022). Learning Inertial Odometry for Dynamic Legged Robot State Estimation. Proceedings of the 5th Conference on Robot Learning, in Proceedings of Machine Learning Research 164:1575-1584. Available from https://proceedings.mlr.press/v164/buchanan22a.html.