ACE: A Cross-platform and visual-Exoskeletons System for Low-Cost Dexterous Teleoperation

Shiqi Yang, Minghuan Liu, Yuzhe Qin, Runyu Ding, Jialong Li, Xuxin Cheng, Ruihan Yang, Sha Yi, Xiaolong Wang
Proceedings of The 8th Conference on Robot Learning, PMLR 270:4895-4911, 2025.

Abstract

Bimanual robotic manipulation with dexterous hands offers great potential workability and a wide workspace, as it follows the most natural human workflow. Learning from human demonstrations has proven highly effective for learning dexterous manipulation policies, and teleoperation is a straightforward and efficient way to collect such data. However, a cost-effective and easy-to-use teleoperation system for anthropomorphic robot hands has been lacking. To fill this gap, we developed ACE, a cross-platform visual-exoskeleton system for low-cost dexterous teleoperation. Our system employs a hand-facing camera to capture 3D hand poses and an exoskeleton mounted on a base that can be easily carried on the user's back. ACE captures both the hand root end-effector pose and the hand pose in real time and enables cross-platform operation. We evaluate the key system parameters against previous teleoperation systems and show clear advantages of ACE. We then showcase the desktop and mobile versions of our system on six different robot platforms (including humanoid-hand, arm-hand, arm-gripper, and quadruped-gripper systems) and demonstrate the effectiveness of learning three difficult real-world tasks from demonstrations collected on two of them.

Cite this Paper


BibTeX
@InProceedings{pmlr-v270-yang25d,
  title     = {ACE: A Cross-platform and visual-Exoskeletons System for Low-Cost Dexterous Teleoperation},
  author    = {Yang, Shiqi and Liu, Minghuan and Qin, Yuzhe and Ding, Runyu and Li, Jialong and Cheng, Xuxin and Yang, Ruihan and Yi, Sha and Wang, Xiaolong},
  booktitle = {Proceedings of The 8th Conference on Robot Learning},
  pages     = {4895--4911},
  year      = {2025},
  editor    = {Agrawal, Pulkit and Kroemer, Oliver and Burgard, Wolfram},
  volume    = {270},
  series    = {Proceedings of Machine Learning Research},
  month     = {06--09 Nov},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v270/main/assets/yang25d/yang25d.pdf},
  url       = {https://proceedings.mlr.press/v270/yang25d.html},
  abstract  = {Bimanual robotic manipulation with dexterous hands has a large potential workability and a wide workspace as it follows the most natural human workflow. Learning from human demonstrations has proven highly effective for learning a dexterous manipulation policy. To collect such data, teleoperation serves as a straightforward and efficient way to do so. However, a cost-effective and easy-to-use teleoperation system is lacking for anthropomorphic robot hands. To fill the deficiency, we developed ACE, a cross-platform visual-exoskeleton system for low-cost dexterous teleoperation. Our system employs a hand-facing camera to capture 3D hand poses and an exoskeleton mounted on a base that can be easily carried on users’ backs. ACE captures both the hand root end-effector and hand pose in real-time and enables cross-platform operations. We evaluate the key system parameters compared with previous teleoperation systems and show clear advantages of ACE. We then showcase the desktop and mobile versions of our system on six different robot platforms (including humanoid-hands, arm-hands, arm-gripper, and quadruped-gripper systems), and demonstrate the effectiveness of learning three difficult real-world tasks through the collected demonstration on two of them.}
}
Endnote
%0 Conference Paper
%T ACE: A Cross-platform and visual-Exoskeletons System for Low-Cost Dexterous Teleoperation
%A Shiqi Yang
%A Minghuan Liu
%A Yuzhe Qin
%A Runyu Ding
%A Jialong Li
%A Xuxin Cheng
%A Ruihan Yang
%A Sha Yi
%A Xiaolong Wang
%B Proceedings of The 8th Conference on Robot Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Pulkit Agrawal
%E Oliver Kroemer
%E Wolfram Burgard
%F pmlr-v270-yang25d
%I PMLR
%P 4895--4911
%U https://proceedings.mlr.press/v270/yang25d.html
%V 270
%X Bimanual robotic manipulation with dexterous hands has a large potential workability and a wide workspace as it follows the most natural human workflow. Learning from human demonstrations has proven highly effective for learning a dexterous manipulation policy. To collect such data, teleoperation serves as a straightforward and efficient way to do so. However, a cost-effective and easy-to-use teleoperation system is lacking for anthropomorphic robot hands. To fill the deficiency, we developed ACE, a cross-platform visual-exoskeleton system for low-cost dexterous teleoperation. Our system employs a hand-facing camera to capture 3D hand poses and an exoskeleton mounted on a base that can be easily carried on users’ backs. ACE captures both the hand root end-effector and hand pose in real-time and enables cross-platform operations. We evaluate the key system parameters compared with previous teleoperation systems and show clear advantages of ACE. We then showcase the desktop and mobile versions of our system on six different robot platforms (including humanoid-hands, arm-hands, arm-gripper, and quadruped-gripper systems), and demonstrate the effectiveness of learning three difficult real-world tasks through the collected demonstration on two of them.
APA
Yang, S., Liu, M., Qin, Y., Ding, R., Li, J., Cheng, X., Yang, R., Yi, S. & Wang, X. (2025). ACE: A Cross-platform and visual-Exoskeletons System for Low-Cost Dexterous Teleoperation. Proceedings of The 8th Conference on Robot Learning, in Proceedings of Machine Learning Research 270:4895-4911. Available from https://proceedings.mlr.press/v270/yang25d.html.
