From Perception to Programs: Regularize, Overparameterize, and Amortize

Hao Tang, Kevin Ellis
Proceedings of the 40th International Conference on Machine Learning, PMLR 202:33616-33631, 2023.

Abstract

We develop techniques for synthesizing neurosymbolic programs. Such programs mix discrete symbolic processing with continuous neural computation. We relax this mixed discrete/continuous problem and jointly learn all modules with gradient descent, and also incorporate amortized inference, overparameterization, and a differentiable strategy for penalizing lengthy programs. Collectively, this toolbox improves the stability of gradient-guided program search, and suggests ways of learning both how to parse continuous input into discrete abstractions, and how to process those abstractions via symbolic code.
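To make the relax-and-penalize idea concrete, below is a minimal sketch in PyTorch. It is illustrative only, not the paper's implementation: the op set, the `RelaxedProgram` class, and the penalty weight are all assumptions. Each slot of an overparameterized straight-line program holds a softmax mixture over discrete operations, so the whole program is differentiable end to end, and the expected number of non-identity slots serves as a differentiable proxy for program length.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Illustrative op set: the identity lets a slot "do nothing", so
# penalizing non-identity mass acts as a soft program-length penalty.
OPS = [
    lambda x: x,        # identity (no-op)
    lambda x: x + 1.0,  # increment
    torch.neg,          # negate
    torch.relu,         # rectify
]

class RelaxedProgram(nn.Module):
    """Overparameterized straight-line program: more slots than the
    target needs, each slot a softmax mixture over OPS."""

    def __init__(self, num_slots: int = 8):
        super().__init__()
        self.logits = nn.Parameter(torch.zeros(num_slots, len(OPS)))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        for slot_logits in self.logits:
            weights = F.softmax(slot_logits, dim=-1)
            # Continuous relaxation: execute all ops, mix their outputs.
            x = sum(w * op(x) for w, op in zip(weights, OPS))
        return x

    def length_penalty(self) -> torch.Tensor:
        # Expected number of slots doing something other than identity.
        probs = F.softmax(self.logits, dim=-1)
        return (1.0 - probs[:, 0]).sum()

# Joint training: fit input/output examples while shrinking the program.
program = RelaxedProgram()
opt = torch.optim.Adam(program.parameters(), lr=1e-2)
xs = torch.randn(64)
ys = torch.relu(xs + 1.0)  # hidden target program: relu(x + 1)

for step in range(2000):
    loss = F.mse_loss(program(xs), ys) + 1e-2 * program.length_penalty()
    opt.zero_grad()
    loss.backward()
    opt.step()
```

After training, a discrete program can be read off by taking the argmax op in each slot; in this toy setting the intent is that the mixture collapses onto `relu(x + 1)` padded with identity slots, with the length penalty pushing unused slots toward the no-op.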

Cite this Paper

BibTeX
@InProceedings{pmlr-v202-tang23c,
  title     = {From Perception to Programs: Regularize, Overparameterize, and Amortize},
  author    = {Tang, Hao and Ellis, Kevin},
  booktitle = {Proceedings of the 40th International Conference on Machine Learning},
  pages     = {33616--33631},
  year      = {2023},
  editor    = {Krause, Andreas and Brunskill, Emma and Cho, Kyunghyun and Engelhardt, Barbara and Sabato, Sivan and Scarlett, Jonathan},
  volume    = {202},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--29 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v202/tang23c/tang23c.pdf},
  url       = {https://proceedings.mlr.press/v202/tang23c.html},
  abstract  = {We develop techniques for synthesizing neurosymbolic programs. Such programs mix discrete symbolic processing with continuous neural computation. We relax this mixed discrete/continuous problem and jointly learn all modules with gradient descent, and also incorporate amortized inference, overparameterization, and a differentiable strategy for penalizing lengthy programs. Collectively, this toolbox improves the stability of gradient-guided program search, and suggests ways of learning both how to parse continuous input into discrete abstractions, and how to process those abstractions via symbolic code.}
}
Endnote
%0 Conference Paper
%T From Perception to Programs: Regularize, Overparameterize, and Amortize
%A Hao Tang
%A Kevin Ellis
%B Proceedings of the 40th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Andreas Krause
%E Emma Brunskill
%E Kyunghyun Cho
%E Barbara Engelhardt
%E Sivan Sabato
%E Jonathan Scarlett
%F pmlr-v202-tang23c
%I PMLR
%P 33616--33631
%U https://proceedings.mlr.press/v202/tang23c.html
%V 202
%X We develop techniques for synthesizing neurosymbolic programs. Such programs mix discrete symbolic processing with continuous neural computation. We relax this mixed discrete/continuous problem and jointly learn all modules with gradient descent, and also incorporate amortized inference, overparameterization, and a differentiable strategy for penalizing lengthy programs. Collectively, this toolbox improves the stability of gradient-guided program search, and suggests ways of learning both how to parse continuous input into discrete abstractions, and how to process those abstractions via symbolic code.
APA
Tang, H. & Ellis, K. (2023). From Perception to Programs: Regularize, Overparameterize, and Amortize. Proceedings of the 40th International Conference on Machine Learning, in Proceedings of Machine Learning Research 202:33616-33631. Available from https://proceedings.mlr.press/v202/tang23c.html.
