Tactile Object Pose Estimation from the First Touch with Geometric Contact Rendering

Maria Bauza Villalonga, Alberto Rodriguez, Bryan Lim, Eric Valls, Theo Sechopoulos
Proceedings of the 2020 Conference on Robot Learning, PMLR 155:1015-1029, 2021.

Abstract

In this paper, we present an approach to tactile pose estimation from the first touch for known objects. First, we create an object-agnostic map from real tactile observations to contact shapes. Next, for a new object with known geometry, we learn a tailored perception model completely in simulation. To do so, we simulate the contact shapes that a dense set of object poses would produce on the sensor. Then, given a new contact shape obtained from the sensor output, we match it against the pre-computed set using the object-specific embedding learned purely in simulation using contrastive learning. This results in a perception model that can localize objects from a single tactile observation. It also allows reasoning over pose distributions and including additional pose constraints coming from other perception systems or multiple contacts. We provide quantitative results for four objects. Our approach provides high accuracy pose estimations from distinctive tactile observations while regressing pose distributions to account for those contact shapes that could result from different object poses. We further extend and test our approach in multi-contact scenarios where several tactile sensors are simultaneously in contact with the object.
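The matching step the abstract describes — embedding a dense set of simulated contact shapes once, then localizing a new sensor reading by nearest-neighbor search in that embedding space — can be sketched as follows. This is an illustrative toy, not the paper's implementation: the fixed random projection `proj` stands in for the learned contrastive encoder, and the pose bank, contact shapes, and array sizes are all made-up placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

def embed(contact_shape, proj):
    """Map a flattened contact-shape image to a unit-norm embedding.
    Here `proj` is a random projection standing in for the trained encoder."""
    z = contact_shape.ravel() @ proj
    return z / np.linalg.norm(z)

# Precompute (offline, in simulation): N object poses -> contact shapes -> embeddings.
n_poses, hw, dim = 500, 16 * 16, 32
proj = rng.standard_normal((hw, dim))          # stand-in for the learned embedding
poses = rng.uniform(-1, 1, size=(n_poses, 6))  # hypothetical (x, y, z, roll, pitch, yaw)
shapes = rng.standard_normal((n_poses, hw))    # stand-in simulated contact shapes
bank = np.stack([embed(s, proj) for s in shapes])

# Online: a new contact shape from the sensor (here, a noisy copy of pose 42).
query = shapes[42] + 0.01 * rng.standard_normal(hw)
scores = bank @ embed(query, proj)  # cosine similarity against the precomputed set
best = int(np.argmax(scores))
print("estimated pose index:", best)
```

Because the scores over the whole pose bank are available, ambiguous contact shapes naturally yield a distribution over candidate poses rather than a single estimate, which is what lets the method fuse constraints from other sensors or multiple simultaneous contacts.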

Cite this Paper


BibTeX
@InProceedings{pmlr-v155-villalonga21a,
  title     = {Tactile Object Pose Estimation from the First Touch with Geometric Contact Rendering},
  author    = {Villalonga, Maria Bauza and Rodriguez, Alberto and Lim, Bryan and Valls, Eric and Sechopoulos, Theo},
  booktitle = {Proceedings of the 2020 Conference on Robot Learning},
  pages     = {1015--1029},
  year      = {2021},
  editor    = {Kober, Jens and Ramos, Fabio and Tomlin, Claire},
  volume    = {155},
  series    = {Proceedings of Machine Learning Research},
  month     = {16--18 Nov},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v155/villalonga21a/villalonga21a.pdf},
  url       = {https://proceedings.mlr.press/v155/villalonga21a.html},
  abstract  = {In this paper, we present an approach to tactile pose estimation from the first touch for known objects. First, we create an object-agnostic map from real tactile observations to contact shapes. Next, for a new object with known geometry, we learn a tailored perception model completely in simulation. To do so, we simulate the contact shapes that a dense set of object poses would produce on the sensor. Then, given a new contact shape obtained from the sensor output, we match it against the pre-computed set using the object-specific embedding learned purely in simulation using contrastive learning. This results in a perception model that can localize objects from a single tactile observation. It also allows reasoning over pose distributions and including additional pose constraints coming from other perception systems or multiple contacts. We provide quantitative results for four objects. Our approach provides high accuracy pose estimations from distinctive tactile observations while regressing pose distributions to account for those contact shapes that could result from different object poses. We further extend and test our approach in multi-contact scenarios where several tactile sensors are simultaneously in contact with the object.}
}
Endnote
%0 Conference Paper
%T Tactile Object Pose Estimation from the First Touch with Geometric Contact Rendering
%A Maria Bauza Villalonga
%A Alberto Rodriguez
%A Bryan Lim
%A Eric Valls
%A Theo Sechopoulos
%B Proceedings of the 2020 Conference on Robot Learning
%C Proceedings of Machine Learning Research
%D 2021
%E Jens Kober
%E Fabio Ramos
%E Claire Tomlin
%F pmlr-v155-villalonga21a
%I PMLR
%P 1015--1029
%U https://proceedings.mlr.press/v155/villalonga21a.html
%V 155
%X In this paper, we present an approach to tactile pose estimation from the first touch for known objects. First, we create an object-agnostic map from real tactile observations to contact shapes. Next, for a new object with known geometry, we learn a tailored perception model completely in simulation. To do so, we simulate the contact shapes that a dense set of object poses would produce on the sensor. Then, given a new contact shape obtained from the sensor output, we match it against the pre-computed set using the object-specific embedding learned purely in simulation using contrastive learning. This results in a perception model that can localize objects from a single tactile observation. It also allows reasoning over pose distributions and including additional pose constraints coming from other perception systems or multiple contacts. We provide quantitative results for four objects. Our approach provides high accuracy pose estimations from distinctive tactile observations while regressing pose distributions to account for those contact shapes that could result from different object poses. We further extend and test our approach in multi-contact scenarios where several tactile sensors are simultaneously in contact with the object.
APA
Villalonga, M.B., Rodriguez, A., Lim, B., Valls, E. & Sechopoulos, T. (2021). Tactile Object Pose Estimation from the First Touch with Geometric Contact Rendering. Proceedings of the 2020 Conference on Robot Learning, in Proceedings of Machine Learning Research 155:1015-1029. Available from https://proceedings.mlr.press/v155/villalonga21a.html.