Integrating Egocentric Localization for More Realistic Point-Goal Navigation Agents

Samyak Datta, Oleksandr Maksymets, Judy Hoffman, Stefan Lee, Dhruv Batra, Devi Parikh
Proceedings of the 2020 Conference on Robot Learning, PMLR 155:313-328, 2021.

Abstract

Recent work has presented embodied agents that can navigate to point-goal targets in novel indoor environments with near-perfect accuracy. However, these agents are equipped with idealized sensors for localization and take deterministic actions. This setting is practically sterile by comparison to the dirty reality of noisy sensors and actuations in the real world – wheels can slip, motion sensors have error, actuations can rebound. In this work, we take a step towards this noisy reality, developing point-goal navigation agents that rely on visual estimates of egomotion under noisy action dynamics. We find these agents outperform naive adaptations of current point-goal agents to this setting, as well as those incorporating classic localization baselines. Further, our model conceptually divides learning agent dynamics or odometry (where am I?) from the task-specific navigation policy (where do I want to go?). This enables seamless adaptation to changing dynamics (a different robot or floor type) by simply re-calibrating the visual odometry model – circumventing the expense of re-training the navigation policy. Our agent was the runner-up in the PointNav track of the CVPR 2020 Habitat Challenge.
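To make the described decoupling concrete, below is a minimal, illustrative PyTorch-style sketch of the idea: a visual odometry module estimates the agent's egomotion from consecutive observations, and that estimate is used to re-express the relative point-goal that is fed to the navigation policy. All names, the network architecture, and the observation shapes here are assumptions chosen for illustration; this is not the authors' actual implementation.

import torch
import torch.nn as nn

class EgomotionEstimator(nn.Module):
    """Predicts the agent's pose change (dx, dy, dtheta) from two consecutive
    RGB-D observations. Stands in for a learned visual odometry module; the
    architecture below is an illustrative CNN, not the paper's exact model."""
    def __init__(self, in_channels=8):  # 2 frames x (RGB + depth) = 8 channels
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(in_channels, 32, kernel_size=8, stride=4), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=4, stride=2), nn.ReLU(),
            nn.Conv2d(64, 64, kernel_size=3, stride=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.head = nn.Linear(64, 3)  # (dx, dy, dtheta) in the agent's frame

    def forward(self, obs_prev, obs_curr):
        x = torch.cat([obs_prev, obs_curr], dim=1)
        return self.head(self.encoder(x))

def update_relative_goal(goal_xy, delta_pose):
    """Re-express the point-goal in the agent's new egocentric frame using the
    estimated egomotion (a standard 2D rigid-body transform)."""
    dx, dy, dtheta = delta_pose.unbind(-1)
    gx, gy = goal_xy.unbind(-1)
    # translate the goal by the negated estimated motion, then rotate it into the new heading
    tx, ty = gx - dx, gy - dy
    cos_t, sin_t = torch.cos(-dtheta), torch.sin(-dtheta)
    new_gx = cos_t * tx - sin_t * ty
    new_gy = sin_t * tx + cos_t * ty
    return torch.stack([new_gx, new_gy], dim=-1)

# Toy usage: two fake 4-channel (RGB-D) frames and a goal 2 m ahead of the agent
odom = EgomotionEstimator()
prev_frame = torch.rand(1, 4, 128, 128)
curr_frame = torch.rand(1, 4, 128, 128)
goal = torch.tensor([[2.0, 0.0]])
delta = odom(prev_frame, curr_frame)           # estimated (dx, dy, dtheta)
goal_in_new_frame = update_relative_goal(goal, delta)  # goal the policy would consume

Because the navigation policy only consumes the updated relative goal, swapping in a re-calibrated odometry model for a new robot or floor type would, under this factorization, leave the policy itself untouched.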

Cite this Paper


BibTeX
@InProceedings{pmlr-v155-datta21a,
  title     = {Integrating Egocentric Localization for More Realistic Point-Goal Navigation Agents},
  author    = {Datta, Samyak and Maksymets, Oleksandr and Hoffman, Judy and Lee, Stefan and Batra, Dhruv and Parikh, Devi},
  booktitle = {Proceedings of the 2020 Conference on Robot Learning},
  pages     = {313--328},
  year      = {2021},
  editor    = {Kober, Jens and Ramos, Fabio and Tomlin, Claire},
  volume    = {155},
  series    = {Proceedings of Machine Learning Research},
  month     = {16--18 Nov},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v155/datta21a/datta21a.pdf},
  url       = {https://proceedings.mlr.press/v155/datta21a.html},
  abstract  = {Recent work has presented embodied agents that can navigate to point-goal targets in novel indoor environments with near-perfect accuracy. However, these agents are equipped with idealized sensors for localization and take deterministic actions. This setting is practically sterile by comparison to the dirty reality of noisy sensors and actuations in the real world – wheels can slip, motion sensors have error, actuations can rebound. In this work, we take a step towards this noisy reality, developing point-goal navigation agents that rely on visual estimates of egomotion under noisy action dynamics. We find these agents outperform naive adaptions of current point-goal agents to this setting as well as those incorporating classic localization baselines. Further, our model conceptually divides learning agent dynamics or odometry (where am I?) from task-specific navigation policy (where do I want to go?). This enables a seamless adaption to changing dynamics (a different robot or floor type) by simply re-calibrating the visual odometry model – circumventing the expense of re-training of the navigation policy. Our agent was the runner-up in the PointNav track of CVPR 2020 Habitat Challenge.}
}
Endnote
%0 Conference Paper
%T Integrating Egocentric Localization for More Realistic Point-Goal Navigation Agents
%A Samyak Datta
%A Oleksandr Maksymets
%A Judy Hoffman
%A Stefan Lee
%A Dhruv Batra
%A Devi Parikh
%B Proceedings of the 2020 Conference on Robot Learning
%C Proceedings of Machine Learning Research
%D 2021
%E Jens Kober
%E Fabio Ramos
%E Claire Tomlin
%F pmlr-v155-datta21a
%I PMLR
%P 313--328
%U https://proceedings.mlr.press/v155/datta21a.html
%V 155
%X Recent work has presented embodied agents that can navigate to point-goal targets in novel indoor environments with near-perfect accuracy. However, these agents are equipped with idealized sensors for localization and take deterministic actions. This setting is practically sterile by comparison to the dirty reality of noisy sensors and actuations in the real world – wheels can slip, motion sensors have error, actuations can rebound. In this work, we take a step towards this noisy reality, developing point-goal navigation agents that rely on visual estimates of egomotion under noisy action dynamics. We find these agents outperform naive adaptions of current point-goal agents to this setting as well as those incorporating classic localization baselines. Further, our model conceptually divides learning agent dynamics or odometry (where am I?) from task-specific navigation policy (where do I want to go?). This enables a seamless adaption to changing dynamics (a different robot or floor type) by simply re-calibrating the visual odometry model – circumventing the expense of re-training of the navigation policy. Our agent was the runner-up in the PointNav track of CVPR 2020 Habitat Challenge.
APA
Datta, S., Maksymets, O., Hoffman, J., Lee, S., Batra, D., & Parikh, D. (2021). Integrating Egocentric Localization for More Realistic Point-Goal Navigation Agents. Proceedings of the 2020 Conference on Robot Learning, in Proceedings of Machine Learning Research 155:313-328. Available from https://proceedings.mlr.press/v155/datta21a.html.