Masking by Moving: Learning Distraction-Free Radar Odometry from Pose Information

Dan Barnes, Rob Weston, Ingmar Posner
Proceedings of the Conference on Robot Learning, PMLR 100:303-316, 2020.

Abstract

This paper presents an end-to-end radar odometry system which delivers robust, real-time pose estimates based on a learned embedding space free of sensing artefacts and distractor objects. The system deploys a fully differentiable, correlation-based radar matching approach. This provides the same level of interpretability as established scan-matching methods and allows for a principled derivation of uncertainty estimates. The system is trained in a (self-)supervised way using only previously obtained pose information as a training signal. Using 280km of urban driving data, we demonstrate that our approach outperforms the previous state-of-the-art in radar odometry by reducing errors by up to 68% whilst running an order of magnitude faster.

Cite this Paper

BibTeX
@InProceedings{pmlr-v100-barnes20a,
  title = {Masking by Moving: Learning Distraction-Free Radar Odometry from Pose Information},
  author = {Barnes, Dan and Weston, Rob and Posner, Ingmar},
  booktitle = {Proceedings of the Conference on Robot Learning},
  pages = {303--316},
  year = {2020},
  editor = {Kaelbling, Leslie Pack and Kragic, Danica and Sugiura, Komei},
  volume = {100},
  series = {Proceedings of Machine Learning Research},
  month = {30 Oct--01 Nov},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v100/barnes20a/barnes20a.pdf},
  url = {https://proceedings.mlr.press/v100/barnes20a.html},
  abstract = {This paper presents an end-to-end radar odometry system which delivers robust, real-time pose estimates based on a learned embedding space free of sensing artefacts and distractor objects. The system deploys a fully differentiable, correlation-based radar matching approach. This provides the same level of interpretability as established scan-matching methods and allows for a principled derivation of uncertainty estimates. The system is trained in a (self-)supervised way using only previously obtained pose information as a training signal. Using 280km of urban driving data, we demonstrate that our approach outperforms the previous state-of-the-art in radar odometry by reducing errors by up to 68% whilst running an order of magnitude faster.}
}
Endnote
%0 Conference Paper
%T Masking by Moving: Learning Distraction-Free Radar Odometry from Pose Information
%A Dan Barnes
%A Rob Weston
%A Ingmar Posner
%B Proceedings of the Conference on Robot Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Leslie Pack Kaelbling
%E Danica Kragic
%E Komei Sugiura
%F pmlr-v100-barnes20a
%I PMLR
%P 303--316
%U https://proceedings.mlr.press/v100/barnes20a.html
%V 100
%X This paper presents an end-to-end radar odometry system which delivers robust, real-time pose estimates based on a learned embedding space free of sensing artefacts and distractor objects. The system deploys a fully differentiable, correlation-based radar matching approach. This provides the same level of interpretability as established scan-matching methods and allows for a principled derivation of uncertainty estimates. The system is trained in a (self-)supervised way using only previously obtained pose information as a training signal. Using 280km of urban driving data, we demonstrate that our approach outperforms the previous state-of-the-art in radar odometry by reducing errors by up to 68% whilst running an order of magnitude faster.
APA
Barnes, D., Weston, R. & Posner, I. (2020). Masking by Moving: Learning Distraction-Free Radar Odometry from Pose Information. Proceedings of the Conference on Robot Learning, in Proceedings of Machine Learning Research 100:303-316. Available from https://proceedings.mlr.press/v100/barnes20a.html.