Real-time Mapping of Physical Scene Properties with an Autonomous Robot Experimenter

Iain Haughton, Edgar Sucar, Andre Mouton, Edward Johns, Andrew Davison
Proceedings of The 6th Conference on Robot Learning, PMLR 205:118-127, 2023.

Abstract

Neural fields can be trained from scratch to represent the shape and appearance of 3D scenes efficiently. It has also been shown that they can densely map correlated properties such as semantics, via sparse interactions from a human labeller. In this work, we show that a robot can densely annotate a scene with arbitrary discrete or continuous physical properties via its own fully-autonomous experimental interactions, as it simultaneously scans and maps it with an RGB-D camera. A variety of scene interactions are possible, including poking with force sensing to determine rigidity, measuring local material type with single-pixel spectroscopy or predicting force distributions by pushing. Sparse experimental interactions are guided by entropy to enable high efficiency, with tabletop scene properties densely mapped from scratch in a few minutes from a few tens of interactions.
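The abstract states that interactions are selected by entropy to maximise information gain per probe. As a rough illustration only (not the paper's implementation; the array of per-point class probabilities and the candidate sites are hypothetical), the core idea of picking the candidate site where the property field is most uncertain can be sketched as:

```python
import numpy as np

def entropy(p, eps=1e-12):
    """Shannon entropy (nats) of a categorical distribution."""
    p = np.clip(p, eps, 1.0)
    return -np.sum(p * np.log(p), axis=-1)

# Hypothetical per-site class probabilities predicted by a property field
# over 3 discrete property classes at 4 candidate interaction sites.
probs = np.array([
    [0.90, 0.05, 0.05],   # confident prediction -> low entropy
    [0.40, 0.30, 0.30],   # uncertain -> high entropy
    [0.80, 0.10, 0.10],
    [0.34, 0.33, 0.33],   # near-uniform -> highest entropy
])

scores = entropy(probs)
next_site = int(np.argmax(scores))  # probe the most uncertain site next
```

After each probe, the measured label would be fed back into the field as a sparse annotation, and the entropy scores recomputed over the remaining candidates.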

Cite this Paper


BibTeX
@InProceedings{pmlr-v205-haughton23a,
  title     = {Real-time Mapping of Physical Scene Properties with an Autonomous Robot Experimenter},
  author    = {Haughton, Iain and Sucar, Edgar and Mouton, Andre and Johns, Edward and Davison, Andrew},
  booktitle = {Proceedings of The 6th Conference on Robot Learning},
  pages     = {118--127},
  year      = {2023},
  editor    = {Liu, Karen and Kulic, Dana and Ichnowski, Jeff},
  volume    = {205},
  series    = {Proceedings of Machine Learning Research},
  month     = {14--18 Dec},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v205/haughton23a/haughton23a.pdf},
  url       = {https://proceedings.mlr.press/v205/haughton23a.html},
  abstract  = {Neural fields can be trained from scratch to represent the shape and appearance of 3D scenes efficiently. It has also been shown that they can densely map correlated properties such as semantics, via sparse interactions from a human labeller. In this work, we show that a robot can densely annotate a scene with arbitrary discrete or continuous physical properties via its own fully-autonomous experimental interactions, as it simultaneously scans and maps it with an RGB-D camera. A variety of scene interactions are possible, including poking with force sensing to determine rigidity, measuring local material type with single-pixel spectroscopy or predicting force distributions by pushing. Sparse experimental interactions are guided by entropy to enable high efficiency, with tabletop scene properties densely mapped from scratch in a few minutes from a few tens of interactions.}
}
Endnote
%0 Conference Paper
%T Real-time Mapping of Physical Scene Properties with an Autonomous Robot Experimenter
%A Iain Haughton
%A Edgar Sucar
%A Andre Mouton
%A Edward Johns
%A Andrew Davison
%B Proceedings of The 6th Conference on Robot Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Karen Liu
%E Dana Kulic
%E Jeff Ichnowski
%F pmlr-v205-haughton23a
%I PMLR
%P 118--127
%U https://proceedings.mlr.press/v205/haughton23a.html
%V 205
%X Neural fields can be trained from scratch to represent the shape and appearance of 3D scenes efficiently. It has also been shown that they can densely map correlated properties such as semantics, via sparse interactions from a human labeller. In this work, we show that a robot can densely annotate a scene with arbitrary discrete or continuous physical properties via its own fully-autonomous experimental interactions, as it simultaneously scans and maps it with an RGB-D camera. A variety of scene interactions are possible, including poking with force sensing to determine rigidity, measuring local material type with single-pixel spectroscopy or predicting force distributions by pushing. Sparse experimental interactions are guided by entropy to enable high efficiency, with tabletop scene properties densely mapped from scratch in a few minutes from a few tens of interactions.
APA
Haughton, I., Sucar, E., Mouton, A., Johns, E., & Davison, A. (2023). Real-time Mapping of Physical Scene Properties with an Autonomous Robot Experimenter. Proceedings of The 6th Conference on Robot Learning, in Proceedings of Machine Learning Research 205:118-127. Available from https://proceedings.mlr.press/v205/haughton23a.html.