Self-supervised perception for tactile skin covered dexterous hands

Akash Sharma, Carolina Higuera, Chaithanya Krishna Bodduluri, Zixi Liu, Taosha Fan, Tess Hellebrekers, Mike Lambeta, Byron Boots, Michael Kaess, Tingfan Wu, Francois Robert Hogan, Mustafa Mukadam
Proceedings of The 9th Conference on Robot Learning, PMLR 305:2311-2328, 2025.

Abstract

We present PercepSkin, a pre-trained encoder for magnetic skin sensors distributed across the fingertips, phalanges, and palm of a dexterous robot hand. Magnetic tactile skins offer a flexible form factor for hand-wide coverage with fast response times, in contrast to vision-based tactile sensors that are restricted to the fingertips and limited by bandwidth. Full-hand tactile perception is crucial for robot dexterity. However, the lack of general-purpose models, together with the challenges of interpreting magnetic flux and of calibration, has limited the adoption of these sensors. Given a history of kinematic and tactile sensing across the hand, PercepSkin outputs a latent tactile embedding that can be used in any downstream task. The encoder is self-supervised via self-distillation on a variety of unlabeled hand-object interactions using an Allegro hand sensorized with Xela uSkin. In experiments across several benchmark tasks, from state estimation to policy learning, we find that pre-trained PercepSkin representations are sample efficient for learning downstream tasks and improve task performance by over 41% compared to prior work and over 56% compared to end-to-end learning.
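
The abstract describes an interface (a window of kinematic and tactile readings in, one latent embedding out) and a self-distillation pretraining recipe. The following PyTorch-style sketch is a minimal, hypothetical illustration of that pattern; the module names, taxel and joint counts, architecture, and loss below are assumptions made for exposition, not the authors' implementation.

    # Minimal sketch of self-distillation pretraining for a tactile-sequence
    # encoder. All shapes and hyperparameters are illustrative assumptions.
    import copy
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class TactileEncoder(nn.Module):
        # Hypothetical encoder: a window of 3-axis flux per taxel, concatenated
        # with joint angles, is summarized into a single latent embedding.
        def __init__(self, n_taxels=368, n_joints=16, d_embed=256):
            super().__init__()
            d_in = n_taxels * 3 + n_joints
            self.rnn = nn.GRU(d_in, d_embed, batch_first=True)
            self.head = nn.Linear(d_embed, d_embed)

        def forward(self, x):               # x: (batch, window, d_in)
            _, h = self.rnn(x)              # h: (1, batch, d_embed)
            return self.head(h.squeeze(0))  # (batch, d_embed)

    student = TactileEncoder()
    teacher = copy.deepcopy(student)        # EMA teacher, updated without gradients
    for p in teacher.parameters():
        p.requires_grad_(False)
    opt = torch.optim.Adam(student.parameters(), lr=3e-4)

    def distillation_step(view_a, view_b, tau=0.996):
        # Two augmented views of the same interaction window; the student is
        # trained to match the teacher's stop-gradient embedding across views.
        z_s = F.normalize(student(view_a), dim=-1)
        with torch.no_grad():
            z_t = F.normalize(teacher(view_b), dim=-1)
        loss = (2 - 2 * (z_s * z_t).sum(dim=-1)).mean()  # BYOL-style cosine loss
        opt.zero_grad()
        loss.backward()
        opt.step()
        with torch.no_grad():               # exponential moving average update
            for pt, ps in zip(teacher.parameters(), student.parameters()):
                pt.mul_(tau).add_(ps, alpha=1.0 - tau)
        return loss.item()

    # Toy call: batch of 8 windows of 32 timesteps; jitter stands in for augmentation.
    view = torch.randn(8, 32, 368 * 3 + 16)
    print(distillation_step(view, view + 0.01 * torch.randn_like(view)))

A downstream task would then freeze (or fine-tune) the pretrained student and feed its embeddings to a small task-specific head.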

Cite this Paper

BibTeX
@InProceedings{pmlr-v305-sharma25a,
  title     = {Self-supervised perception for tactile skin covered dexterous hands},
  author    = {Sharma, Akash and Higuera, Carolina and Bodduluri, Chaithanya Krishna and Liu, Zixi and Fan, Taosha and Hellebrekers, Tess and Lambeta, Mike and Boots, Byron and Kaess, Michael and Wu, Tingfan and Hogan, Francois Robert and Mukadam, Mustafa},
  booktitle = {Proceedings of The 9th Conference on Robot Learning},
  pages     = {2311--2328},
  year      = {2025},
  editor    = {Lim, Joseph and Song, Shuran and Park, Hae-Won},
  volume    = {305},
  series    = {Proceedings of Machine Learning Research},
  month     = {27--30 Sep},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v305/main/assets/sharma25a/sharma25a.pdf},
  url       = {https://proceedings.mlr.press/v305/sharma25a.html}
}
Endnote
%0 Conference Paper
%T Self-supervised perception for tactile skin covered dexterous hands
%A Akash Sharma
%A Carolina Higuera
%A Chaithanya Krishna Bodduluri
%A Zixi Liu
%A Taosha Fan
%A Tess Hellebrekers
%A Mike Lambeta
%A Byron Boots
%A Michael Kaess
%A Tingfan Wu
%A Francois Robert Hogan
%A Mustafa Mukadam
%B Proceedings of The 9th Conference on Robot Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Joseph Lim
%E Shuran Song
%E Hae-Won Park
%F pmlr-v305-sharma25a
%I PMLR
%P 2311--2328
%U https://proceedings.mlr.press/v305/sharma25a.html
%V 305
APA
Sharma, A., Higuera, C., Bodduluri, C.K., Liu, Z., Fan, T., Hellebrekers, T., Lambeta, M., Boots, B., Kaess, M., Wu, T., Hogan, F.R. & Mukadam, M. (2025). Self-supervised perception for tactile skin covered dexterous hands. Proceedings of The 9th Conference on Robot Learning, in Proceedings of Machine Learning Research 305:2311-2328. Available from https://proceedings.mlr.press/v305/sharma25a.html.