Neural Symbolic Regression that scales

Luca Biggio, Tommaso Bendinelli, Alexander Neitz, Aurelien Lucchi, Giambattista Parascandolo
Proceedings of the 38th International Conference on Machine Learning, PMLR 139:936-945, 2021.

Abstract

Symbolic equations are at the core of scientific discovery. The task of discovering the underlying equation from a set of input-output pairs is called symbolic regression. Traditionally, symbolic regression methods use hand-designed strategies that do not improve with experience. In this paper, we introduce the first symbolic regression method that leverages large scale pre-training. We procedurally generate an unbounded set of equations, and simultaneously pre-train a Transformer to predict the symbolic equation from a corresponding set of input-output pairs. At test time, we query the model on a new set of points and use its output to guide the search for the equation. We show empirically that this approach can re-discover a set of well-known physical equations, and that it improves over time with more data and compute.
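
The abstract describes the pre-training pipeline only at a high level. As a rough illustration of the data-generation step (procedurally sampling equations and evaluating them to obtain input-output pairs), a minimal sketch is given below. The expression grammar, operator set, constant and input ranges, and the filtering of ill-behaved samples are placeholders chosen for illustration, not the paper's actual configuration.

    # Illustrative sketch only (not the authors' code): procedurally generate
    # (input-output pairs, symbolic equation) examples of the kind a
    # set-to-sequence Transformer could be pre-trained on.
    import random
    import numpy as np

    UNARY = ["sin", "cos", "exp", "log"]
    BINARY = ["+", "-", "*", "/"]

    def sample_expression(depth=3):
        """Sample a random expression string over the variable x."""
        if depth == 0 or random.random() < 0.3:
            return random.choice(["x", str(random.randint(1, 3))])
        if random.random() < 0.5:
            op = random.choice(UNARY)
            return f"{op}({sample_expression(depth - 1)})"
        op = random.choice(BINARY)
        return f"({sample_expression(depth - 1)} {op} {sample_expression(depth - 1)})"

    def make_example(n_points=50):
        """Return (x, y, expression), or None if the sampled equation is ill-behaved."""
        expr = sample_expression()
        x = np.random.uniform(-5, 5, size=n_points)
        env = {"x": x, "sin": np.sin, "cos": np.cos, "exp": np.exp, "log": np.log}
        with np.errstate(all="ignore"):
            y = eval(expr, {"__builtins__": {}}, env)  # toy evaluation of the sampled string
        y = np.broadcast_to(np.asarray(y, dtype=float), x.shape)
        if not np.all(np.isfinite(y)):
            return None  # discard equations producing NaN/inf on the sampled inputs
        return x, y, expr

    if __name__ == "__main__":
        examples = [e for e in (make_example() for _ in range(100)) if e is not None]
        x, y, expr = examples[0]
        print(expr, x[:3], y[:3])

In the paper's setting, examples of this kind (points as input, equation tokens as target) would serve as supervision for the pre-trained Transformer, whose predictions at test time guide the search for the underlying equation on new data.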

Cite this Paper


BibTeX
@InProceedings{pmlr-v139-biggio21a,
  title     = {Neural Symbolic Regression that scales},
  author    = {Biggio, Luca and Bendinelli, Tommaso and Neitz, Alexander and Lucchi, Aurelien and Parascandolo, Giambattista},
  booktitle = {Proceedings of the 38th International Conference on Machine Learning},
  pages     = {936--945},
  year      = {2021},
  editor    = {Meila, Marina and Zhang, Tong},
  volume    = {139},
  series    = {Proceedings of Machine Learning Research},
  month     = {18--24 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v139/biggio21a/biggio21a.pdf},
  url       = {https://proceedings.mlr.press/v139/biggio21a.html},
  abstract  = {Symbolic equations are at the core of scientific discovery. The task of discovering the underlying equation from a set of input-output pairs is called symbolic regression. Traditionally, symbolic regression methods use hand-designed strategies that do not improve with experience. In this paper, we introduce the first symbolic regression method that leverages large scale pre-training. We procedurally generate an unbounded set of equations, and simultaneously pre-train a Transformer to predict the symbolic equation from a corresponding set of input-output pairs. At test time, we query the model on a new set of points and use its output to guide the search for the equation. We show empirically that this approach can re-discover a set of well-known physical equations, and that it improves over time with more data and compute.}
}
Endnote
%0 Conference Paper
%T Neural Symbolic Regression that scales
%A Luca Biggio
%A Tommaso Bendinelli
%A Alexander Neitz
%A Aurelien Lucchi
%A Giambattista Parascandolo
%B Proceedings of the 38th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2021
%E Marina Meila
%E Tong Zhang
%F pmlr-v139-biggio21a
%I PMLR
%P 936--945
%U https://proceedings.mlr.press/v139/biggio21a.html
%V 139
%X Symbolic equations are at the core of scientific discovery. The task of discovering the underlying equation from a set of input-output pairs is called symbolic regression. Traditionally, symbolic regression methods use hand-designed strategies that do not improve with experience. In this paper, we introduce the first symbolic regression method that leverages large scale pre-training. We procedurally generate an unbounded set of equations, and simultaneously pre-train a Transformer to predict the symbolic equation from a corresponding set of input-output pairs. At test time, we query the model on a new set of points and use its output to guide the search for the equation. We show empirically that this approach can re-discover a set of well-known physical equations, and that it improves over time with more data and compute.
APA
Biggio, L., Bendinelli, T., Neitz, A., Lucchi, A. & Parascandolo, G. (2021). Neural Symbolic Regression that scales. Proceedings of the 38th International Conference on Machine Learning, in Proceedings of Machine Learning Research 139:936-945. Available from https://proceedings.mlr.press/v139/biggio21a.html.
