AnyRotate: Gravity-Invariant In-Hand Object Rotation with Sim-to-Real Touch

Max Yang, Chenghua Lu, Alex Church, Yijiong Lin, Christopher J. Ford, Haoran Li, Efi Psomopoulou, David A.W. Barton, Nathan F. Lepora
Proceedings of The 8th Conference on Robot Learning, PMLR 270:4727-4747, 2025.

Abstract

Human hands are capable of in-hand manipulation in the presence of different hand motions. For a robot hand, harnessing rich tactile information to achieve this level of dexterity still remains a significant challenge. In this paper, we present AnyRotate, a system for gravity-invariant multi-axis in-hand object rotation using dense featured sim-to-real touch. We tackle this problem by training a dense tactile policy in simulation and present a sim-to-real method for rich tactile sensing to achieve zero-shot policy transfer. Our formulation allows the training of a unified policy to rotate unseen objects about arbitrary rotation axes in any hand direction. In our experiments, we highlight the benefit of capturing detailed contact information when handling objects of varying properties. Interestingly, we found rich multi-fingered tactile sensing can detect unstable grasps and provide a reactive behavior that improves the robustness of the policy.

Cite this Paper


BibTeX
@InProceedings{pmlr-v270-yang25c,
  title     = {AnyRotate: Gravity-Invariant In-Hand Object Rotation with Sim-to-Real Touch},
  author    = {Yang, Max and Lu, Chenghua and Church, Alex and Lin, Yijiong and Ford, Christopher J. and Li, Haoran and Psomopoulou, Efi and Barton, David A.W. and Lepora, Nathan F.},
  booktitle = {Proceedings of The 8th Conference on Robot Learning},
  pages     = {4727--4747},
  year      = {2025},
  editor    = {Agrawal, Pulkit and Kroemer, Oliver and Burgard, Wolfram},
  volume    = {270},
  series    = {Proceedings of Machine Learning Research},
  month     = {06--09 Nov},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v270/main/assets/yang25c/yang25c.pdf},
  url       = {https://proceedings.mlr.press/v270/yang25c.html},
  abstract  = {Human hands are capable of in-hand manipulation in the presence of different hand motions. For a robot hand, harnessing rich tactile information to achieve this level of dexterity still remains a significant challenge. In this paper, we present AnyRotate, a system for gravity-invariant multi-axis in-hand object rotation using dense featured sim-to-real touch. We tackle this problem by training a dense tactile policy in simulation and present a sim-to-real method for rich tactile sensing to achieve zero-shot policy transfer. Our formulation allows the training of a unified policy to rotate unseen objects about arbitrary rotation axes in any hand direction. In our experiments, we highlight the benefit of capturing detailed contact information when handling objects of varying properties. Interestingly, we found rich multi-fingered tactile sensing can detect unstable grasps and provide a reactive behavior that improves the robustness of the policy.}
}
Endnote
%0 Conference Paper
%T AnyRotate: Gravity-Invariant In-Hand Object Rotation with Sim-to-Real Touch
%A Max Yang
%A Chenghua Lu
%A Alex Church
%A Yijiong Lin
%A Christopher J. Ford
%A Haoran Li
%A Efi Psomopoulou
%A David A.W. Barton
%A Nathan F. Lepora
%B Proceedings of The 8th Conference on Robot Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Pulkit Agrawal
%E Oliver Kroemer
%E Wolfram Burgard
%F pmlr-v270-yang25c
%I PMLR
%P 4727--4747
%U https://proceedings.mlr.press/v270/yang25c.html
%V 270
%X Human hands are capable of in-hand manipulation in the presence of different hand motions. For a robot hand, harnessing rich tactile information to achieve this level of dexterity still remains a significant challenge. In this paper, we present AnyRotate, a system for gravity-invariant multi-axis in-hand object rotation using dense featured sim-to-real touch. We tackle this problem by training a dense tactile policy in simulation and present a sim-to-real method for rich tactile sensing to achieve zero-shot policy transfer. Our formulation allows the training of a unified policy to rotate unseen objects about arbitrary rotation axes in any hand direction. In our experiments, we highlight the benefit of capturing detailed contact information when handling objects of varying properties. Interestingly, we found rich multi-fingered tactile sensing can detect unstable grasps and provide a reactive behavior that improves the robustness of the policy.
APA
Yang, M., Lu, C., Church, A., Lin, Y., Ford, C.J., Li, H., Psomopoulou, E., Barton, D.A.W. & Lepora, N.F. (2025). AnyRotate: Gravity-Invariant In-Hand Object Rotation with Sim-to-Real Touch. Proceedings of The 8th Conference on Robot Learning, in Proceedings of Machine Learning Research 270:4727-4747. Available from https://proceedings.mlr.press/v270/yang25c.html.
