Touching a NeRF: Leveraging Neural Radiance Fields for Tactile Sensory Data Generation

Shaohong Zhong, Alessandro Albini, Oiwi Parker Jones, Perla Maiolino, Ingmar Posner
Proceedings of The 6th Conference on Robot Learning, PMLR 205:1618-1628, 2023.

Abstract

Tactile perception is key for robotics applications such as manipulation. However, tactile data collection is time-consuming, especially when compared to vision. This limits the use of the tactile modality in machine learning solutions in robotics. In this paper, we propose a generative model to simulate realistic tactile sensory data for use in downstream tasks. Starting with easily obtained camera images, we train Neural Radiance Fields (NeRF) for objects of interest. We then use NeRF-rendered RGB-D images as inputs to a conditional Generative Adversarial Network (cGAN) to generate tactile images from desired orientations. We evaluate the generated data quantitatively using the Structural Similarity Index (SSIM) and Mean Squared Error (MSE) metrics, and also using a tactile classification task both in simulation and in the real world. Results show that, by augmenting a manually collected dataset, the generated data increases classification accuracy by around 10%. In addition, we demonstrate that our model can transfer from one tactile sensor to another with a small fine-tuning dataset.
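As a concrete illustration of the quantitative evaluation described in the abstract, the sketch below compares a generated tactile image against a real one using SSIM and MSE via scikit-image. This is a hypothetical reconstruction for illustration only, not the authors' released code; the image shape, dtype, and the scikit-image dependency are assumptions.

```python
# Minimal sketch (not the authors' code): scoring a generated tactile
# image against a real sensor image with SSIM and MSE, as in the paper's
# quantitative evaluation. Assumes paired uint8 RGB images of equal shape.
import numpy as np
from skimage.metrics import structural_similarity, mean_squared_error

def evaluate_tactile_pair(generated: np.ndarray, real: np.ndarray):
    """Return (SSIM, MSE) between a generated and a real tactile image."""
    # For RGB tactile images, SSIM is computed per channel and averaged;
    # data_range must match the image dtype (255 for uint8).
    ssim = structural_similarity(generated, real,
                                 channel_axis=-1, data_range=255)
    mse = mean_squared_error(generated, real)
    return ssim, mse

# Usage example with random stand-in arrays; real usage would load a
# cGAN output and the corresponding tactile sensor capture.
rng = np.random.default_rng(0)
gen = rng.integers(0, 256, size=(240, 320, 3), dtype=np.uint8)
ref = rng.integers(0, 256, size=(240, 320, 3), dtype=np.uint8)
ssim, mse = evaluate_tactile_pair(gen, ref)
print(f"SSIM: {ssim:.3f}, MSE: {mse:.1f}")
```

Higher SSIM and lower MSE indicate generated tactile images closer to the real sensor readings; the paper additionally validates the data on a downstream tactile classification task.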

Cite this Paper


BibTeX

@InProceedings{pmlr-v205-zhong23a,
  title     = {Touching a NeRF: Leveraging Neural Radiance Fields for Tactile Sensory Data Generation},
  author    = {Zhong, Shaohong and Albini, Alessandro and Jones, Oiwi Parker and Maiolino, Perla and Posner, Ingmar},
  booktitle = {Proceedings of The 6th Conference on Robot Learning},
  pages     = {1618--1628},
  year      = {2023},
  editor    = {Liu, Karen and Kulic, Dana and Ichnowski, Jeff},
  volume    = {205},
  series    = {Proceedings of Machine Learning Research},
  month     = {14--18 Dec},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v205/zhong23a/zhong23a.pdf},
  url       = {https://proceedings.mlr.press/v205/zhong23a.html}
}
APA

Zhong, S., Albini, A., Jones, O. P., Maiolino, P., & Posner, I. (2023). Touching a NeRF: Leveraging Neural Radiance Fields for Tactile Sensory Data Generation. Proceedings of The 6th Conference on Robot Learning, in Proceedings of Machine Learning Research 205:1618-1628. Available from https://proceedings.mlr.press/v205/zhong23a.html.
