A Neural-Guided Dynamic Symbolic Network for Exploring Mathematical Expressions from Data

Wenqiang Li, Weijun Li, Lina Yu, Min Wu, Linjun Sun, Jingyi Liu, Yanjie Li, Shu Wei, Deng Yusong, Meilan Hao
Proceedings of the 41st International Conference on Machine Learning, PMLR 235:28222-28242, 2024.

Abstract

Symbolic regression (SR) is a powerful technique for discovering the underlying mathematical expressions from observed data. Inspired by the success of deep learning, recent deep generative SR methods have shown promising results. However, these methods face difficulties in processing high-dimensional problems and learning constants due to the large search space, and they don’t scale well to unseen problems. In this work, we propose DySymNet, a novel neural-guided Dynamic Symbolic Network for SR. Instead of searching for expressions within a large search space, we explore symbolic networks with various structures, guided by reinforcement learning, and optimize them to identify expressions that better fit the data. Based on extensive numerical experiments on low-dimensional public standard benchmarks and the well-known SRBench with more variables, DySymNet shows clear superiority over several representative baseline models. Open source code is available at https://github.com/AILWQ/DySymNet.
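To make the idea of a "symbolic network" concrete, the sketch below shows one layer in the common EQL style: a linear map followed by element-wise symbolic activations. This is a minimal illustration, not DySymNet's actual architecture; the operator set, layer width, and `symbolic_layer` helper are assumptions chosen for brevity.

```python
import numpy as np

# Minimal sketch of one symbolic-network layer (EQL-style illustration).
# The operator pool and sizes below are hypothetical, not DySymNet's config.

rng = np.random.default_rng(0)

def symbolic_layer(x, W, b, ops):
    """Linear combination of inputs, then one symbolic operator per unit."""
    z = x @ W + b  # shape: (n_samples, n_units)
    return np.column_stack([op(z[:, i]) for i, op in enumerate(ops)])

# Example: 2 input variables -> 3 units with ops [sin, identity, square].
ops = [np.sin, lambda v: v, np.square]
W = rng.normal(size=(2, 3))
b = rng.normal(size=3)

x = rng.normal(size=(5, 2))      # 5 data points, 2 variables
h = symbolic_layer(x, W, b, ops)
print(h.shape)                   # (5, 3)
```

In this framing, the RL controller's job would be to choose the structure (depth, widths, operator assignments) of such a network, after which the weights are optimized against the data and a compact expression is read off the trained network.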

Cite this Paper
BibTeX
@InProceedings{pmlr-v235-li24ap,
  title     = {A Neural-Guided Dynamic Symbolic Network for Exploring Mathematical Expressions from Data},
  author    = {Li, Wenqiang and Li, Weijun and Yu, Lina and Wu, Min and Sun, Linjun and Liu, Jingyi and Li, Yanjie and Wei, Shu and Yusong, Deng and Hao, Meilan},
  booktitle = {Proceedings of the 41st International Conference on Machine Learning},
  pages     = {28222--28242},
  year      = {2024},
  editor    = {Salakhutdinov, Ruslan and Kolter, Zico and Heller, Katherine and Weller, Adrian and Oliver, Nuria and Scarlett, Jonathan and Berkenkamp, Felix},
  volume    = {235},
  series    = {Proceedings of Machine Learning Research},
  month     = {21--27 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v235/main/assets/li24ap/li24ap.pdf},
  url       = {https://proceedings.mlr.press/v235/li24ap.html},
  abstract  = {Symbolic regression (SR) is a powerful technique for discovering the underlying mathematical expressions from observed data. Inspired by the success of deep learning, recent deep generative SR methods have shown promising results. However, these methods face difficulties in processing high-dimensional problems and learning constants due to the large search space, and they don’t scale well to unseen problems. In this work, we propose DySymNet, a novel neural-guided Dynamic Symbolic Network for SR. Instead of searching for expressions within a large search space, we explore symbolic networks with various structures, guided by reinforcement learning, and optimize them to identify expressions that better fit the data. Based on extensive numerical experiments on low-dimensional public standard benchmarks and the well-known SRBench with more variables, DySymNet shows clear superiority over several representative baseline models. Open source code is available at https://github.com/AILWQ/DySymNet.}
}
Endnote
%0 Conference Paper
%T A Neural-Guided Dynamic Symbolic Network for Exploring Mathematical Expressions from Data
%A Wenqiang Li
%A Weijun Li
%A Lina Yu
%A Min Wu
%A Linjun Sun
%A Jingyi Liu
%A Yanjie Li
%A Shu Wei
%A Deng Yusong
%A Meilan Hao
%B Proceedings of the 41st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2024
%E Ruslan Salakhutdinov
%E Zico Kolter
%E Katherine Heller
%E Adrian Weller
%E Nuria Oliver
%E Jonathan Scarlett
%E Felix Berkenkamp
%F pmlr-v235-li24ap
%I PMLR
%P 28222--28242
%U https://proceedings.mlr.press/v235/li24ap.html
%V 235
%X Symbolic regression (SR) is a powerful technique for discovering the underlying mathematical expressions from observed data. Inspired by the success of deep learning, recent deep generative SR methods have shown promising results. However, these methods face difficulties in processing high-dimensional problems and learning constants due to the large search space, and they don’t scale well to unseen problems. In this work, we propose DySymNet, a novel neural-guided Dynamic Symbolic Network for SR. Instead of searching for expressions within a large search space, we explore symbolic networks with various structures, guided by reinforcement learning, and optimize them to identify expressions that better fit the data. Based on extensive numerical experiments on low-dimensional public standard benchmarks and the well-known SRBench with more variables, DySymNet shows clear superiority over several representative baseline models. Open source code is available at https://github.com/AILWQ/DySymNet.
APA
Li, W., Li, W., Yu, L., Wu, M., Sun, L., Liu, J., Li, Y., Wei, S., Yusong, D. & Hao, M. (2024). A Neural-Guided Dynamic Symbolic Network for Exploring Mathematical Expressions from Data. Proceedings of the 41st International Conference on Machine Learning, in Proceedings of Machine Learning Research 235:28222-28242. Available from https://proceedings.mlr.press/v235/li24ap.html.

Related Material