Learning Portable Representations for High-Level Planning

Steven James, Benjamin Rosman, George Konidaris
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:4682-4691, 2020.

Abstract

We present a framework for autonomously learning a portable representation that describes a collection of low-level continuous environments. We show that these abstract representations can be learned in a task-independent egocentric space specific to the agent and that, when grounded with problem-specific information, they are provably sufficient for planning. We demonstrate transfer in two different domains, where an agent learns a portable, task-independent symbolic vocabulary, as well as operators expressed in that vocabulary, and then learns to instantiate those operators on a per-task basis. This reduces the number of samples required to learn a representation of a new task.
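
To make the framework in the abstract concrete, the sketch below is our own Python illustration, not the authors' implementation; all names (PortableOperator, GroundedOperator, plan, and the integer "partition" identifying problem-specific context) are hypothetical. It separates the two pieces the abstract describes: operators expressed once over portable, egocentric symbols, and a per-task grounding that attaches the problem-specific information needed to plan.

from collections import deque
from dataclasses import dataclass

@dataclass(frozen=True)
class PortableOperator:
    """An operator over task-independent egocentric symbols."""
    name: str
    pre: frozenset     # egocentric symbols required before execution
    add: frozenset     # egocentric symbols made true by execution
    delete: frozenset  # egocentric symbols made false by execution

@dataclass(frozen=True)
class GroundedOperator:
    """A portable operator instantiated with problem-specific information:
    the task partition in which it applies and the one it leads to."""
    base: PortableOperator
    start: int
    end: int

def plan(operators, init, goal_partition):
    """Breadth-first search over (egocentric symbols, partition) states."""
    frontier = deque([(init, [])])
    seen = {init}
    while frontier:
        (symbols, partition), path = frontier.popleft()
        if partition == goal_partition:
            return path
        for op in operators:
            if op.start == partition and op.base.pre <= symbols:
                nxt = ((symbols - op.base.delete) | op.base.add, op.end)
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append((nxt, path + [op.base.name]))
    return None

# Two portable operators, learned once in egocentric space.
open_door = PortableOperator(
    "open_door",
    pre=frozenset({"facing_closed_door"}),
    add=frozenset({"facing_open_door"}),
    delete=frozenset({"facing_closed_door"}))
walk_through = PortableOperator(
    "walk_through",
    pre=frozenset({"facing_open_door"}),
    add=frozenset({"facing_closed_door"}),
    delete=frozenset({"facing_open_door"}))

# Per-task grounding: the same operators are instantiated for each of this
# task's two doors (partitions 0 -> 1 -> 2).
grounded = [
    GroundedOperator(open_door, start=0, end=0),
    GroundedOperator(walk_through, start=0, end=1),
    GroundedOperator(open_door, start=1, end=1),
    GroundedOperator(walk_through, start=1, end=2),
]

print(plan(grounded, (frozenset({"facing_closed_door"}), 0), goal_partition=2))
# ['open_door', 'walk_through', 'open_door', 'walk_through']

Because the PortableOperator definitions transfer across tasks unchanged, only the grounding list needs to be relearned for a new task, which is the source of the sample-efficiency gain the abstract claims.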

Cite this Paper

BibTeX
@InProceedings{pmlr-v119-james20a,
  title     = {Learning Portable Representations for High-Level Planning},
  author    = {James, Steven and Rosman, Benjamin and Konidaris, George},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {4682--4691},
  year      = {2020},
  editor    = {III, Hal Daumé and Singh, Aarti},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/james20a/james20a.pdf},
  url       = {https://proceedings.mlr.press/v119/james20a.html},
  abstract  = {We present a framework for autonomously learning a portable representation that describes a collection of low-level continuous environments. We show that these abstract representations can be learned in a task-independent egocentric space specific to the agent that, when grounded with problem-specific information, are provably sufficient for planning. We demonstrate transfer in two different domains, where an agent learns a portable, task-independent symbolic vocabulary, as well as operators expressed in that vocabulary, and then learns to instantiate those operators on a per-task basis. This reduces the number of samples required to learn a representation of a new task.}
}
Endnote
%0 Conference Paper
%T Learning Portable Representations for High-Level Planning
%A Steven James
%A Benjamin Rosman
%A George Konidaris
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-james20a
%I PMLR
%P 4682--4691
%U https://proceedings.mlr.press/v119/james20a.html
%V 119
%X We present a framework for autonomously learning a portable representation that describes a collection of low-level continuous environments. We show that these abstract representations can be learned in a task-independent egocentric space specific to the agent that, when grounded with problem-specific information, are provably sufficient for planning. We demonstrate transfer in two different domains, where an agent learns a portable, task-independent symbolic vocabulary, as well as operators expressed in that vocabulary, and then learns to instantiate those operators on a per-task basis. This reduces the number of samples required to learn a representation of a new task.
APA
James, S., Rosman, B. & Konidaris, G. (2020). Learning Portable Representations for High-Level Planning. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:4682-4691. Available from https://proceedings.mlr.press/v119/james20a.html.