Enhancing Learning with Primitive-Decomposed Cognitive Representations

Jamie C. Macbeth
Proceedings of the First International Workshop on Self-Supervised Learning, PMLR 131:89-98, 2020.

Abstract

This paper proposes work that applies insights from meaning representation systems for in-depth natural language understanding to representations for self-supervised learning systems, which show promise in developing complex, deeply-nested symbolic structures through self-motivated exploration of their environments. The core of the representation system transforms language inputs into language-free structures that are complex combinations of conceptual primitives, forming a substrate for human-like understanding and common-sense reasoning. We focus on decomposing representations of expectation, intention, planning, and decision-making which are essential to a self-motivated learner. These meaning representations may enhance learning by enabling a rich array of mappings between new experiences and structures stored in short-term and long-term memory. We also argue that learning can be further enhanced when language interaction itself is an integral part of the environment in which the self-supervised learning agent is embedded.

Cite this Paper


BibTeX
@InProceedings{pmlr-v131-macbeth20a,
  title = {Enhancing Learning with Primitive-Decomposed Cognitive Representations},
  author = {Macbeth, Jamie C.},
  booktitle = {Proceedings of the First International Workshop on Self-Supervised Learning},
  pages = {89--98},
  year = {2020},
  editor = {Minsky, Henry and Robertson, Paul and Georgeon, Olivier L. and Minsky, Milan and Shaoul, Cyrus},
  volume = {131},
  series = {Proceedings of Machine Learning Research},
  month = {27--28 Feb},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v131/macbeth20a/macbeth20a.pdf},
  url = {https://proceedings.mlr.press/v131/macbeth20a.html},
  abstract = {This paper proposes work that applies insights from meaning representation systems for in-depth natural language understanding to representations for self-supervised learning systems, which show promise in developing complex, deeply-nested symbolic structures through self-motivated exploration of their environments. The core of the representation system transforms language inputs into language-free structures that are complex combinations of conceptual primitives, forming a substrate for human-like understanding and common-sense reasoning. We focus on decomposing representations of expectation, intention, planning, and decision-making which are essential to a self-motivated learner. These meaning representations may enhance learning by enabling a rich array of mappings between new experiences and structures stored in short-term and long-term memory. We also argue that learning can be further enhanced when language interaction itself is an integral part of the environment in which the self-supervised learning agent is embedded.}
}
Endnote
%0 Conference Paper
%T Enhancing Learning with Primitive-Decomposed Cognitive Representations
%A Jamie C. Macbeth
%B Proceedings of the First International Workshop on Self-Supervised Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Henry Minsky
%E Paul Robertson
%E Olivier L. Georgeon
%E Milan Minsky
%E Cyrus Shaoul
%F pmlr-v131-macbeth20a
%I PMLR
%P 89--98
%U https://proceedings.mlr.press/v131/macbeth20a.html
%V 131
%X This paper proposes work that applies insights from meaning representation systems for in-depth natural language understanding to representations for self-supervised learning systems, which show promise in developing complex, deeply-nested symbolic structures through self-motivated exploration of their environments. The core of the representation system transforms language inputs into language-free structures that are complex combinations of conceptual primitives, forming a substrate for human-like understanding and common-sense reasoning. We focus on decomposing representations of expectation, intention, planning, and decision-making which are essential to a self-motivated learner. These meaning representations may enhance learning by enabling a rich array of mappings between new experiences and structures stored in short-term and long-term memory. We also argue that learning can be further enhanced when language interaction itself is an integral part of the environment in which the self-supervised learning agent is embedded.
APA
Macbeth, J.C. (2020). Enhancing Learning with Primitive-Decomposed Cognitive Representations. Proceedings of the First International Workshop on Self-Supervised Learning, in Proceedings of Machine Learning Research 131:89-98. Available from https://proceedings.mlr.press/v131/macbeth20a.html.