Towards Learning to Detect and Predict Contact Events on Vision-based Tactile Sensors

Yazhan Zhang, Weihao Yuan, Zicheng Kan, Michael Yu Wang
Proceedings of the Conference on Robot Learning, PMLR 100:1395-1404, 2020.

Abstract

In essence, a successful grasp boils down to correct responses to multiple contact events between fingertips and objects. In most scenarios, tactile sensing is adequate to distinguish contact events. Because tactile information is high-dimensional, however, classifying spatiotemporal tactile signals with conventional model-based methods is difficult. In this work, we propose to predict and classify tactile signals using deep learning methods, seeking to enhance the adaptability of a robotic grasping system to external events that may lead to grasping failure. We develop a deep learning framework, collect 6650 tactile image sequences with a vision-based tactile sensor, and integrate the trained neural network into a contact-event-based robotic grasping system. In grasping experiments, contact detection yields a 52% increase in object-lifting success rate, and slip prediction provides significantly higher robustness under unexpected loads than open-loop grasps, demonstrating that integrating the proposed framework into a robotic grasping system substantially improves picking success rate and the ability to withstand external disturbances.
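The abstract describes classifying short tactile image sequences into contact events, a task commonly realized as a per-frame CNN encoder followed by a recurrent layer. Below is a minimal PyTorch sketch of such a spatiotemporal classifier; the architecture, the class name TactileEventNet, the event labels, and all hyperparameters are illustrative assumptions, not the authors' exact network.

```python
# Minimal sketch of a spatiotemporal contact-event classifier for
# vision-based tactile image sequences. Hypothetical architecture and
# label set -- NOT the exact network from Zhang et al. (2020).
import torch
import torch.nn as nn

EVENTS = ["no_contact", "contact", "slip"]  # assumed event set

class TactileEventNet(nn.Module):
    def __init__(self, num_events=len(EVENTS), hidden=128):
        super().__init__()
        # Per-frame CNN encoder for tactile images (assumed 3x64x64 input).
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, 5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(16, 32, 5, stride=2, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4), nn.Flatten(),  # -> 32*4*4 = 512 features
        )
        # LSTM aggregates per-frame features across the sequence.
        self.lstm = nn.LSTM(input_size=512, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, num_events)

    def forward(self, frames):  # frames: (B, T, 3, H, W)
        b, t = frames.shape[:2]
        feats = self.encoder(frames.flatten(0, 1)).view(b, t, -1)
        out, _ = self.lstm(feats)
        return self.head(out[:, -1])  # event logits at the final frame

# Usage: classify a batch of 8 sequences of 10 tactile frames each.
logits = TactileEventNet()(torch.randn(8, 10, 3, 64, 64))
event = EVENTS[logits.argmax(dim=1)[0].item()]
```

In the contact-event-based grasping loop the abstract describes, such per-sequence predictions would gate the controller, e.g. closing the gripper until a contact event is detected and tightening or re-grasping when slip is predicted.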

Cite this Paper


BibTeX
@InProceedings{pmlr-v100-zhang20b,
  title     = {Towards Learning to Detect and Predict Contact Events on Vision-based Tactile Sensors},
  author    = {Zhang, Yazhan and Yuan, Weihao and Kan, Zicheng and Wang, Michael Yu},
  booktitle = {Proceedings of the Conference on Robot Learning},
  pages     = {1395--1404},
  year      = {2020},
  editor    = {Kaelbling, Leslie Pack and Kragic, Danica and Sugiura, Komei},
  volume    = {100},
  series    = {Proceedings of Machine Learning Research},
  month     = {30 Oct--01 Nov},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v100/zhang20b/zhang20b.pdf},
  url       = {https://proceedings.mlr.press/v100/zhang20b.html}
}
Endnote
%0 Conference Paper
%T Towards Learning to Detect and Predict Contact Events on Vision-based Tactile Sensors
%A Yazhan Zhang
%A Weihao Yuan
%A Zicheng Kan
%A Michael Yu Wang
%B Proceedings of the Conference on Robot Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Leslie Pack Kaelbling
%E Danica Kragic
%E Komei Sugiura
%F pmlr-v100-zhang20b
%I PMLR
%P 1395--1404
%U https://proceedings.mlr.press/v100/zhang20b.html
%V 100
APA
Zhang, Y., Yuan, W., Kan, Z. & Wang, M. Y. (2020). Towards Learning to Detect and Predict Contact Events on Vision-based Tactile Sensors. Proceedings of the Conference on Robot Learning, in Proceedings of Machine Learning Research 100:1395-1404. Available from https://proceedings.mlr.press/v100/zhang20b.html.