Towards a Formal Theory of Representational Compositionality

Eric Elmoznino, Thomas Jiralerspong, Yoshua Bengio, Guillaume Lajoie
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:15266-15295, 2025.

Abstract

Compositionality is believed to be fundamental to intelligence. In humans, it underlies the structure of thought and language. In AI, it enables a powerful form of out-of-distribution generalization, in which a model systematically adapts to novel combinations of known concepts. However, while we have strong intuitions about what compositionality is, we lack satisfying formal definitions for it. Here, we propose such a definition called representational compositionality that is conceptually simple, quantitative, and grounded in algorithmic information theory. Intuitively, representational compositionality states that a compositional representation is both expressive and describable as a simple function of parts. We validate our definition on both real and synthetic data, and show how it unifies disparate intuitions from across the literature in both AI and cognitive science. We hope that our definition can inspire the design of novel, theoretically-driven models that better capture the mechanisms of compositional thought. We make our code available at https://github.com/EricElmoznino/complexity_compositionality.

Cite this Paper


BibTeX
@InProceedings{pmlr-v267-elmoznino25a,
  title     = {Towards a Formal Theory of Representational Compositionality},
  author    = {Elmoznino, Eric and Jiralerspong, Thomas and Bengio, Yoshua and Lajoie, Guillaume},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages     = {15266--15295},
  year      = {2025},
  editor    = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume    = {267},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/elmoznino25a/elmoznino25a.pdf},
  url       = {https://proceedings.mlr.press/v267/elmoznino25a.html},
  abstract  = {Compositionality is believed to be fundamental to intelligence. In humans, it underlies the structure of thought and language. In AI, it enables a powerful form of out-of-distribution generalization, in which a model systematically adapts to novel combinations of known concepts. However, while we have strong intuitions about what compositionality is, we lack satisfying formal definitions for it. Here, we propose such a definition called representational compositionality that is conceptually simple, quantitative, and grounded in algorithmic information theory. Intuitively, representational compositionality states that a compositional representation is both expressive and describable as a simple function of parts. We validate our definition on both real and synthetic data, and show how it unifies disparate intuitions from across the literature in both AI and cognitive science. We hope that our definition can inspire the design of novel, theoretically-driven models that better capture the mechanisms of compositional thought. We make our code available at https://github.com/EricElmoznino/complexity_compositionality.}
}
Endnote
%0 Conference Paper
%T Towards a Formal Theory of Representational Compositionality
%A Eric Elmoznino
%A Thomas Jiralerspong
%A Yoshua Bengio
%A Guillaume Lajoie
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-elmoznino25a
%I PMLR
%P 15266--15295
%U https://proceedings.mlr.press/v267/elmoznino25a.html
%V 267
%X Compositionality is believed to be fundamental to intelligence. In humans, it underlies the structure of thought and language. In AI, it enables a powerful form of out-of-distribution generalization, in which a model systematically adapts to novel combinations of known concepts. However, while we have strong intuitions about what compositionality is, we lack satisfying formal definitions for it. Here, we propose such a definition called representational compositionality that is conceptually simple, quantitative, and grounded in algorithmic information theory. Intuitively, representational compositionality states that a compositional representation is both expressive and describable as a simple function of parts. We validate our definition on both real and synthetic data, and show how it unifies disparate intuitions from across the literature in both AI and cognitive science. We hope that our definition can inspire the design of novel, theoretically-driven models that better capture the mechanisms of compositional thought. We make our code available at https://github.com/EricElmoznino/complexity_compositionality.
APA
Elmoznino, E., Jiralerspong, T., Bengio, Y. & Lajoie, G. (2025). Towards a Formal Theory of Representational Compositionality. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:15266-15295. Available from https://proceedings.mlr.press/v267/elmoznino25a.html.