Long Range Navigator (LRN): Extending robot planning horizons beyond metric maps

Matt Schmittle, Rohan Baijal, Nathan Hatch, Rosario Scalise, Mateo Guaman Castro, Sidharth Talia, Khimya Khetarpal, Byron Boots, Siddhartha Srinivasa
Proceedings of The 9th Conference on Robot Learning, PMLR 305:3185-3199, 2025.

Abstract

A robot navigating an outdoor environment with no prior knowledge of the space must rely on its local sensing, which is in the form of a local metric map or local policy with some fixed horizon. A limited planning horizon can often result in myopic decisions leading the robot off course or worse, into very difficult terrain. In this work, we make a key observation that long range navigation only necessitates identifying good frontier directions for planning instead of full map knowledge. To address this, we introduce Long Range Navigator (LRN), which learns to predict ‘affordable’ frontier directions from high-dimensional camera images. LRN is trained entirely on unlabeled egocentric videos, making it scalable and adaptable. In off-road tests on Spot and a large vehicle, LRN reduces human interventions and improves decision speed when integrated into existing navigation stacks.

Cite this Paper


BibTeX
@InProceedings{pmlr-v305-schmittle25a,
  title     = {Long Range Navigator (LRN): Extending robot planning horizons beyond metric maps},
  author    = {Schmittle, Matt and Baijal, Rohan and Hatch, Nathan and Scalise, Rosario and Castro, Mateo Guaman and Talia, Sidharth and Khetarpal, Khimya and Boots, Byron and Srinivasa, Siddhartha},
  booktitle = {Proceedings of The 9th Conference on Robot Learning},
  pages     = {3185--3199},
  year      = {2025},
  editor    = {Lim, Joseph and Song, Shuran and Park, Hae-Won},
  volume    = {305},
  series    = {Proceedings of Machine Learning Research},
  month     = {27--30 Sep},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v305/main/assets/schmittle25a/schmittle25a.pdf},
  url       = {https://proceedings.mlr.press/v305/schmittle25a.html},
  abstract  = {A robot navigating an outdoor environment with no prior knowledge of the space must rely on its local sensing, which is in the form of a local metric map or local policy with some fixed horizon. A limited planning horizon can often result in myopic decisions leading the robot off course or worse, into very difficult terrain. In this work, we make a key observation that long range navigation only necessitates identifying good frontier directions for planning instead of full map knowledge. To address this, we introduce Long Range Navigator (LRN), which learns to predict ‘affordable’ frontier directions from high-dimensional camera images. LRN is trained entirely on unlabeled egocentric videos, making it scalable and adaptable. In off-road tests on Spot and a large vehicle, LRN reduces human interventions and improves decision speed when integrated into existing navigation stacks.}
}
Endnote
%0 Conference Paper
%T Long Range Navigator (LRN): Extending robot planning horizons beyond metric maps
%A Matt Schmittle
%A Rohan Baijal
%A Nathan Hatch
%A Rosario Scalise
%A Mateo Guaman Castro
%A Sidharth Talia
%A Khimya Khetarpal
%A Byron Boots
%A Siddhartha Srinivasa
%B Proceedings of The 9th Conference on Robot Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Joseph Lim
%E Shuran Song
%E Hae-Won Park
%F pmlr-v305-schmittle25a
%I PMLR
%P 3185--3199
%U https://proceedings.mlr.press/v305/schmittle25a.html
%V 305
%X A robot navigating an outdoor environment with no prior knowledge of the space must rely on its local sensing, which is in the form of a local metric map or local policy with some fixed horizon. A limited planning horizon can often result in myopic decisions leading the robot off course or worse, into very difficult terrain. In this work, we make a key observation that long range navigation only necessitates identifying good frontier directions for planning instead of full map knowledge. To address this, we introduce Long Range Navigator (LRN), which learns to predict ‘affordable’ frontier directions from high-dimensional camera images. LRN is trained entirely on unlabeled egocentric videos, making it scalable and adaptable. In off-road tests on Spot and a large vehicle, LRN reduces human interventions and improves decision speed when integrated into existing navigation stacks.
APA
Schmittle, M., Baijal, R., Hatch, N., Scalise, R., Castro, M.G., Talia, S., Khetarpal, K., Boots, B. & Srinivasa, S. (2025). Long Range Navigator (LRN): Extending robot planning horizons beyond metric maps. Proceedings of The 9th Conference on Robot Learning, in Proceedings of Machine Learning Research 305:3185-3199. Available from https://proceedings.mlr.press/v305/schmittle25a.html.