Learning to See Physical Properties with Active Sensing Motor Policies

Gabriel B. Margolis, Xiang Fu, Yandong Ji, Pulkit Agrawal
Proceedings of The 7th Conference on Robot Learning, PMLR 229:2537-2548, 2023.

Abstract

To plan efficient robot locomotion, we must use the information about a terrain’s physics that can be inferred from color images. To this end, we train a visual perception module that predicts terrain properties using labels from a small amount of real-world proprioceptive locomotion. To ensure label precision, we introduce Active Sensing Motor Policies (ASMP). These policies are trained to prefer motor skills that facilitate accurately estimating the environment’s physics, like swiping a foot to observe friction. The estimated labels supervise a vision model that infers physical properties directly from color images and can be reused for different tasks. Leveraging a pretrained vision backbone, we demonstrate robust generalization in image space, enabling path planning from overhead imagery despite using only ground camera images for training.

Cite this Paper


BibTeX
@InProceedings{pmlr-v229-margolis23a,
  title = {Learning to See Physical Properties with Active Sensing Motor Policies},
  author = {Margolis, Gabriel B. and Fu, Xiang and Ji, Yandong and Agrawal, Pulkit},
  booktitle = {Proceedings of The 7th Conference on Robot Learning},
  pages = {2537--2548},
  year = {2023},
  editor = {Tan, Jie and Toussaint, Marc and Darvish, Kourosh},
  volume = {229},
  series = {Proceedings of Machine Learning Research},
  month = {06--09 Nov},
  publisher = {PMLR},
  pdf = {https://proceedings.mlr.press/v229/margolis23a/margolis23a.pdf},
  url = {https://proceedings.mlr.press/v229/margolis23a.html},
  abstract = {To plan efficient robot locomotion, we must use the information about a terrain’s physics that can be inferred from color images. To this end, we train a visual perception module that predicts terrain properties using labels from a small amount of real-world proprioceptive locomotion. To ensure label precision, we introduce Active Sensing Motor Policies (ASMP). These policies are trained to prefer motor skills that facilitate accurately estimating the environment’s physics, like swiping a foot to observe friction. The estimated labels supervise a vision model that infers physical properties directly from color images and can be reused for different tasks. Leveraging a pretrained vision backbone, we demonstrate robust generalization in image space, enabling path planning from overhead imagery despite using only ground camera images for training.}
}
Endnote
%0 Conference Paper
%T Learning to See Physical Properties with Active Sensing Motor Policies
%A Gabriel B. Margolis
%A Xiang Fu
%A Yandong Ji
%A Pulkit Agrawal
%B Proceedings of The 7th Conference on Robot Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Jie Tan
%E Marc Toussaint
%E Kourosh Darvish
%F pmlr-v229-margolis23a
%I PMLR
%P 2537--2548
%U https://proceedings.mlr.press/v229/margolis23a.html
%V 229
%X To plan efficient robot locomotion, we must use the information about a terrain’s physics that can be inferred from color images. To this end, we train a visual perception module that predicts terrain properties using labels from a small amount of real-world proprioceptive locomotion. To ensure label precision, we introduce Active Sensing Motor Policies (ASMP). These policies are trained to prefer motor skills that facilitate accurately estimating the environment’s physics, like swiping a foot to observe friction. The estimated labels supervise a vision model that infers physical properties directly from color images and can be reused for different tasks. Leveraging a pretrained vision backbone, we demonstrate robust generalization in image space, enabling path planning from overhead imagery despite using only ground camera images for training.
APA
Margolis, G.B., Fu, X., Ji, Y. &amp; Agrawal, P. (2023). Learning to See Physical Properties with Active Sensing Motor Policies. Proceedings of The 7th Conference on Robot Learning, in Proceedings of Machine Learning Research 229:2537-2548. Available from https://proceedings.mlr.press/v229/margolis23a.html.