Translating Robot Skills: Learning Unsupervised Skill Correspondences Across Robots

Tanmay Shankar, Yixin Lin, Aravind Rajeswaran, Vikash Kumar, Stuart Anderson, Jean Oh
Proceedings of the 39th International Conference on Machine Learning, PMLR 162:19626-19644, 2022.

Abstract

In this paper, we explore how we can endow robots with the ability to learn correspondences between their own skills and those of morphologically different robots in different domains, in an entirely unsupervised manner. Our key insight is that morphologically different robots use similar task strategies to solve similar tasks. Based on this insight, we frame learning skill correspondences as a problem of matching distributions of sequences of skills across robots. We then present an unsupervised objective that encourages a learnt skill translation model to match these distributions across domains, inspired by recent advances in unsupervised machine translation. Our approach learns semantically meaningful correspondences between skills across multiple robot-robot and human-robot domain pairs despite being completely unsupervised. Further, the learnt correspondences enable the transfer of task strategies across robots and domains. We present dynamic visualizations of our results at https://sites.google.com/view/translatingrobotskills/home.
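
The core idea, training a skill translation model so that the distribution of translated skill sequences matches the skill distribution of the target robot, can be illustrated with a small sketch. The snippet below is not the authors' implementation: it assumes skills in each domain are already encoded as continuous latent vectors by pre-trained skill encoders, simplifies sequences to batches of individual skill latents, and uses a maximum mean discrepancy (MMD) loss plus a cycle-consistency term as a stand-in for the paper's unsupervised translation objective. All module names, dimensions, and hyperparameters are hypothetical.

import torch
import torch.nn as nn

SKILL_DIM = 16  # dimensionality of a skill latent in either domain (assumed)

def mlp(in_dim, out_dim, hidden=64):
    return nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU(), nn.Linear(hidden, out_dim))

translate_s2t = mlp(SKILL_DIM, SKILL_DIM)   # source-robot skills -> target-robot skill space
translate_t2s = mlp(SKILL_DIM, SKILL_DIM)   # inverse direction, used for cycle consistency

def mmd(x, y, sigma=1.0):
    # Gaussian-kernel maximum mean discrepancy between two batches of vectors.
    def k(a, b):
        d2 = torch.cdist(a, b).pow(2)
        return torch.exp(-d2 / (2 * sigma ** 2)).mean()
    return k(x, x) + k(y, y) - 2 * k(x, y)

opt = torch.optim.Adam(
    list(translate_s2t.parameters()) + list(translate_t2s.parameters()), lr=1e-3)

for step in range(1000):
    # Unpaired batches of skill latents from demonstrations in each domain.
    # Random tensors stand in for the real encoded skills here.
    src_skills = torch.randn(128, SKILL_DIM)
    tgt_skills = torch.randn(128, SKILL_DIM) + 0.5

    translated = translate_s2t(src_skills)
    # Distribution matching: translated source skills should be indistinguishable
    # from skills actually used by the target robot.
    loss_match = mmd(translated, tgt_skills)
    # Cycle consistency: translating back should recover the original source skills.
    loss_cycle = (translate_t2s(translated) - src_skills).pow(2).mean()

    loss = loss_match + loss_cycle
    opt.zero_grad()
    loss.backward()
    opt.step()

In this simplified form the objective only matches marginal skill distributions; the paper's formulation matches distributions over sequences of skills, which is what lets task strategies transfer across domains.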

Cite this Paper


BibTeX
@InProceedings{pmlr-v162-shankar22a,
  title     = {Translating Robot Skills: Learning Unsupervised Skill Correspondences Across Robots},
  author    = {Shankar, Tanmay and Lin, Yixin and Rajeswaran, Aravind and Kumar, Vikash and Anderson, Stuart and Oh, Jean},
  booktitle = {Proceedings of the 39th International Conference on Machine Learning},
  pages     = {19626--19644},
  year      = {2022},
  editor    = {Chaudhuri, Kamalika and Jegelka, Stefanie and Song, Le and Szepesvari, Csaba and Niu, Gang and Sabato, Sivan},
  volume    = {162},
  series    = {Proceedings of Machine Learning Research},
  month     = {17--23 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v162/shankar22a/shankar22a.pdf},
  url       = {https://proceedings.mlr.press/v162/shankar22a.html},
  abstract  = {In this paper, we explore how we can endow robots with the ability to learn correspondences between their own skills, and those of morphologically different robots in different domains, in an entirely unsupervised manner. We make the insight that different morphological robots use similar task strategies to solve similar tasks. Based on this insight, we frame learning skill correspondences as a problem of matching distributions of sequences of skills across robots. We then present an unsupervised objective that encourages a learnt skill translation model to match these distributions across domains, inspired by recent advances in unsupervised machine translation. Our approach is able to learn semantically meaningful correspondences between skills across multiple robot-robot and human-robot domain pairs despite being completely unsupervised. Further, the learnt correspondences enable the transfer of task strategies across robots and domains. We present dynamic visualizations of our results at https://sites.google.com/view/translatingrobotskills/home.}
}
Endnote
%0 Conference Paper
%T Translating Robot Skills: Learning Unsupervised Skill Correspondences Across Robots
%A Tanmay Shankar
%A Yixin Lin
%A Aravind Rajeswaran
%A Vikash Kumar
%A Stuart Anderson
%A Jean Oh
%B Proceedings of the 39th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2022
%E Kamalika Chaudhuri
%E Stefanie Jegelka
%E Le Song
%E Csaba Szepesvari
%E Gang Niu
%E Sivan Sabato
%F pmlr-v162-shankar22a
%I PMLR
%P 19626--19644
%U https://proceedings.mlr.press/v162/shankar22a.html
%V 162
%X In this paper, we explore how we can endow robots with the ability to learn correspondences between their own skills, and those of morphologically different robots in different domains, in an entirely unsupervised manner. We make the insight that different morphological robots use similar task strategies to solve similar tasks. Based on this insight, we frame learning skill correspondences as a problem of matching distributions of sequences of skills across robots. We then present an unsupervised objective that encourages a learnt skill translation model to match these distributions across domains, inspired by recent advances in unsupervised machine translation. Our approach is able to learn semantically meaningful correspondences between skills across multiple robot-robot and human-robot domain pairs despite being completely unsupervised. Further, the learnt correspondences enable the transfer of task strategies across robots and domains. We present dynamic visualizations of our results at https://sites.google.com/view/translatingrobotskills/home.
APA
Shankar, T., Lin, Y., Rajeswaran, A., Kumar, V., Anderson, S. & Oh, J. (2022). Translating Robot Skills: Learning Unsupervised Skill Correspondences Across Robots. Proceedings of the 39th International Conference on Machine Learning, in Proceedings of Machine Learning Research 162:19626-19644. Available from https://proceedings.mlr.press/v162/shankar22a.html.
