Legolas: Deep Leg-Inertial Odometry

Justin Wasserman, Ananye Agarwal, Rishabh Jangir, Girish Chowdhary, Deepak Pathak, Abhinav Gupta
Proceedings of The 8th Conference on Robot Learning, PMLR 270:2928-2947, 2025.

Abstract

Odometry estimation, in which an accumulating position and rotation is tracked, has critical applications across robotics as a form of state estimation, such as in SLAM, navigation, and control. During deployment of a legged robot, a vision system can easily lose tracking. Using only the onboard leg and inertial sensors for odometry is therefore a promising alternative. Previous methods for estimating leg-inertial odometry require either analytical modeling or high-quality real-world trajectories to train a model. Analytical modeling is specific to each robot, requires manual fine-tuning, and does not always capture real-world phenomena such as slippage. Prior learned approaches to legged odometry still rely on collected real-world data, which has been shown to perform poorly out of distribution. In this work, we show that it is possible to estimate the odometry of a legged robot without any analytical modeling or real-world data collection, and we present Legolas, the first method that accurately estimates odometry for quadruped robots in a purely data-driven fashion. We deploy our method on two real-world quadruped robots in both indoor and outdoor environments. In the indoor scenes, our method achieves a relative pose error that is 73% lower than an analytical filtering-based approach and 87.5% lower than a real-world behavioral cloning approach. More results are available at: learned-odom.github.io
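To make the setting concrete: odometry here is dead-reckoning, chaining per-step relative pose estimates (predicted from leg-joint and inertial readings) into a global trajectory, and relative pose error compares predicted and ground-truth motion over short windows. The sketch below is a minimal, hypothetical illustration of both ideas under a planar (x, y, yaw) assumption; the function names, window length, and the simplified NumPy implementation are our own and not taken from the paper.

import numpy as np

def integrate_odometry(rel_poses):
    """Chain per-step body-frame displacements (dx, dy, dyaw) into a
    global (x, y, yaw) trajectory starting from the origin."""
    poses = [np.zeros(3)]
    for dx, dy, dyaw in rel_poses:
        x, y, yaw = poses[-1]
        # Rotate the body-frame step into the world frame before adding.
        wx = x + dx * np.cos(yaw) - dy * np.sin(yaw)
        wy = y + dx * np.sin(yaw) + dy * np.cos(yaw)
        wyaw = (yaw + dyaw + np.pi) % (2 * np.pi) - np.pi  # wrap to (-pi, pi]
        poses.append(np.array([wx, wy, wyaw]))
    return np.stack(poses)

def relative_pose_error(pred, gt, window=10):
    """Mean translational error between predicted and ground-truth
    displacement over fixed-length windows (a simplified planar RPE)."""
    errs = []
    for t in range(len(gt) - window):
        pred_step = pred[t + window, :2] - pred[t, :2]
        gt_step = gt[t + window, :2] - gt[t, :2]
        errs.append(np.linalg.norm(pred_step - gt_step))
    return float(np.mean(errs))

# Example: a small per-step bias compounds into trajectory-level error.
T = 200
gt_rel = np.tile([0.05, 0.0, 0.01], (T, 1))   # ground-truth per-step motion
pred_rel = gt_rel + [0.002, 0.0, 0.001]       # hypothetical systematic drift
gt_traj = integrate_odometry(gt_rel)
pred_traj = integrate_odometry(pred_rel)
print(f"RPE over 10-step windows: {relative_pose_error(pred_traj, gt_traj):.4f} m")

Because errors accumulate multiplicatively through the yaw term, even a small heading bias per step can dominate the trajectory-level error, which is why windowed relative pose error is the standard metric rather than raw endpoint distance.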

Cite this Paper


BibTeX
@InProceedings{pmlr-v270-wasserman25a,
  title     = {Legolas: Deep Leg-Inertial Odometry},
  author    = {Wasserman, Justin and Agarwal, Ananye and Jangir, Rishabh and Chowdhary, Girish and Pathak, Deepak and Gupta, Abhinav},
  booktitle = {Proceedings of The 8th Conference on Robot Learning},
  pages     = {2928--2947},
  year      = {2025},
  editor    = {Agrawal, Pulkit and Kroemer, Oliver and Burgard, Wolfram},
  volume    = {270},
  series    = {Proceedings of Machine Learning Research},
  month     = {06--09 Nov},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v270/main/assets/wasserman25a/wasserman25a.pdf},
  url       = {https://proceedings.mlr.press/v270/wasserman25a.html}
}
Endnote
%0 Conference Paper
%T Legolas: Deep Leg-Inertial Odometry
%A Justin Wasserman
%A Ananye Agarwal
%A Rishabh Jangir
%A Girish Chowdhary
%A Deepak Pathak
%A Abhinav Gupta
%B Proceedings of The 8th Conference on Robot Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Pulkit Agrawal
%E Oliver Kroemer
%E Wolfram Burgard
%F pmlr-v270-wasserman25a
%I PMLR
%P 2928--2947
%U https://proceedings.mlr.press/v270/wasserman25a.html
%V 270
APA
Wasserman, J., Agarwal, A., Jangir, R., Chowdhary, G., Pathak, D. & Gupta, A. (2025). Legolas: Deep Leg-Inertial Odometry. Proceedings of The 8th Conference on Robot Learning, in Proceedings of Machine Learning Research 270:2928-2947. Available from https://proceedings.mlr.press/v270/wasserman25a.html.