Learning Continuous Semantic Representations of Symbolic Expressions

Miltiadis Allamanis, Pankajan Chanthirasegaran, Pushmeet Kohli, Charles Sutton
Proceedings of the 34th International Conference on Machine Learning, PMLR 70:80-88, 2017.

Abstract

Combining abstract, symbolic reasoning with continuous neural reasoning is a grand challenge of representation learning. As a step in this direction, we propose a new architecture, called neural equivalence network, for the problem of learning continuous semantic representations of algebraic and logical expressions. These networks are trained to represent semantic equivalence, even of expressions that are syntactically very different. The challenge is that semantic representations must be computed in a syntax-directed manner, because semantics is compositional, but at the same time, small changes in syntax can lead to very large changes in semantics, which can be difficult for continuous neural architectures. We perform an exhaustive evaluation on the task of checking equivalence on a highly diverse class of symbolic algebraic and boolean expression types, showing that our model significantly outperforms existing architectures.
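To make the idea of syntax-directed semantic encoding concrete, here is a minimal sketch of a recursive (tree-structured) encoder over Boolean expression trees. It is not the paper's exact EqNet architecture; it only illustrates the general scheme the abstract describes, in which a node's vector is composed from its children's vectors by operator-specific parameters and then normalized, so that, once trained on equivalence data, semantically equivalent expressions should map to nearby vectors. All names, dimensions, and the single-layer combiner below are illustrative assumptions.

```python
# A minimal sketch (assumed, not the paper's EqNet) of syntax-directed encoding:
# each expression is a tree; a node's vector is computed from its children's
# vectors by an operator-specific layer, and the result is L2-normalized so
# that semantic similarity can be read off with a dot product.
import numpy as np

DIM = 16
rng = np.random.default_rng(0)

# One weight matrix per operator, plus an embedding per leaf variable.
params = {
    "leaf": {s: rng.normal(size=DIM) for s in "abc"},
    "op": {op: rng.normal(size=(DIM, 2 * DIM)) for op in ("and", "or")},
    "neg": rng.normal(size=(DIM, DIM)),
}

def normalize(v):
    return v / (np.linalg.norm(v) + 1e-8)

def encode(expr):
    """Recursively encode a nested-tuple expression, e.g. ("and", "a", ("neg", "b"))."""
    if isinstance(expr, str):                      # variable leaf
        return normalize(params["leaf"][expr])
    op, *args = expr
    children = [encode(a) for a in args]
    if op == "neg":                                # unary negation
        h = params["neg"] @ children[0]
    else:                                          # binary Boolean operator
        h = params["op"][op] @ np.concatenate(children)
    return normalize(np.tanh(h))

# With untrained weights this similarity is arbitrary; after training on
# equivalence classes, equivalent expressions such as
# ("neg", ("and", "a", "b")) and ("or", ("neg", "a"), ("neg", "b"))
# should receive high dot-product similarity.
v1 = encode(("neg", ("and", "a", "b")))
v2 = encode(("or", ("neg", "a"), ("neg", "b")))
print(float(v1 @ v2))
```

The sketch omits the training objective; supervision would come from grouping syntactically different but semantically equivalent expressions, as the abstract describes.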

Cite this Paper


BibTeX
@InProceedings{pmlr-v70-allamanis17a,
  title     = {Learning Continuous Semantic Representations of Symbolic Expressions},
  author    = {Miltiadis Allamanis and Pankajan Chanthirasegaran and Pushmeet Kohli and Charles Sutton},
  booktitle = {Proceedings of the 34th International Conference on Machine Learning},
  pages     = {80--88},
  year      = {2017},
  editor    = {Precup, Doina and Teh, Yee Whye},
  volume    = {70},
  series    = {Proceedings of Machine Learning Research},
  month     = {06--11 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v70/allamanis17a/allamanis17a.pdf},
  url       = {https://proceedings.mlr.press/v70/allamanis17a.html},
  abstract  = {Combining abstract, symbolic reasoning with continuous neural reasoning is a grand challenge of representation learning. As a step in this direction, we propose a new architecture, called neural equivalence network, for the problem of learning continuous semantic representations of algebraic and logical expressions. These networks are trained to represent semantic equivalence, even of expressions that are syntactically very different. The challenge is that semantic representations must be computed in a syntax-directed manner, because semantics is compositional, but at the same time, small changes in syntax can lead to very large changes in semantics, which can be difficult for continuous neural architectures. We perform an exhaustive evaluation on the task of checking equivalence on a highly diverse class of symbolic algebraic and boolean expression types, showing that our model significantly outperforms existing architectures.}
}
Endnote
%0 Conference Paper
%T Learning Continuous Semantic Representations of Symbolic Expressions
%A Miltiadis Allamanis
%A Pankajan Chanthirasegaran
%A Pushmeet Kohli
%A Charles Sutton
%B Proceedings of the 34th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2017
%E Doina Precup
%E Yee Whye Teh
%F pmlr-v70-allamanis17a
%I PMLR
%P 80--88
%U https://proceedings.mlr.press/v70/allamanis17a.html
%V 70
%X Combining abstract, symbolic reasoning with continuous neural reasoning is a grand challenge of representation learning. As a step in this direction, we propose a new architecture, called neural equivalence network, for the problem of learning continuous semantic representations of algebraic and logical expressions. These networks are trained to represent semantic equivalence, even of expressions that are syntactically very different. The challenge is that semantic representations must be computed in a syntax-directed manner, because semantics is compositional, but at the same time, small changes in syntax can lead to very large changes in semantics, which can be difficult for continuous neural architectures. We perform an exhaustive evaluation on the task of checking equivalence on a highly diverse class of symbolic algebraic and boolean expression types, showing that our model significantly outperforms existing architectures.
APA
Allamanis, M., Chanthirasegaran, P., Kohli, P. & Sutton, C. (2017). Learning Continuous Semantic Representations of Symbolic Expressions. Proceedings of the 34th International Conference on Machine Learning, in Proceedings of Machine Learning Research 70:80-88. Available from https://proceedings.mlr.press/v70/allamanis17a.html.