Learning Neurosymbolic Generative Models via Program Synthesis

Halley Young, Osbert Bastani, Mayur Naik
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:7144-7153, 2019.

Abstract

Generative models have become significantly more powerful in recent years. However, these models continue to have difficulty capturing global structure in data. For example, images of buildings typically contain spatial patterns such as windows repeating at regular intervals, but state-of-the-art models have difficulty generating these patterns. We propose to address this problem by incorporating programs representing global structure into generative models (e.g., a 2D for-loop may represent a repeating pattern of windows), along with a framework for learning these models by leveraging program synthesis to obtain training data. On both synthetic and real-world data, we demonstrate that our approach substantially outperforms state-of-the-art at both generating and completing images with global structure.
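To make the abstract's example concrete: the following is a minimal, hypothetical sketch (not the paper's actual DSL or implementation) of how a 2D for-loop can encode the kind of global structure described, stamping a regular grid of "windows" onto a blank facade image. The function name and parameters are illustrative assumptions.

```python
import numpy as np

def render_windows(height=32, width=32, win=3, stride=8):
    """Illustrative only: a 2D for-loop capturing a repeating
    window pattern, the global structure the paper's programs encode."""
    img = np.zeros((height, width), dtype=np.uint8)
    for row in range(2, height - win, stride):      # repeat vertically
        for col in range(2, width - win, stride):   # repeat horizontally
            img[row:row + win, col:col + win] = 1   # stamp one window
    return img

facade = render_windows()
print(facade.sum())  # 16 windows of 3x3 pixels each -> 144
```

A program like this compactly represents structure (regular spacing, fixed window size) that a purely neural generative model often fails to reproduce, which is the gap the neurosymbolic approach targets.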

Cite this Paper


BibTeX
@InProceedings{pmlr-v97-young19a,
  title     = {Learning Neurosymbolic Generative Models via Program Synthesis},
  author    = {Young, Halley and Bastani, Osbert and Naik, Mayur},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning},
  pages     = {7144--7153},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Salakhutdinov, Ruslan},
  volume    = {97},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--15 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v97/young19a/young19a.pdf},
  url       = {https://proceedings.mlr.press/v97/young19a.html},
  abstract  = {Generative models have become significantly more powerful in recent years. However, these models continue to have difficulty capturing global structure in data. For example, images of buildings typically contain spatial patterns such as windows repeating at regular intervals, but state-of-the-art models have difficulty generating these patterns. We propose to address this problem by incorporating programs representing global structure into generative models{—}e.g., a 2D for-loop may represent a repeating pattern of windows{—}along with a framework for learning these models by leveraging program synthesis to obtain training data. On both synthetic and real-world data, we demonstrate that our approach substantially outperforms state-of-the-art at both generating and completing images with global structure.}
}
Endnote
%0 Conference Paper
%T Learning Neurosymbolic Generative Models via Program Synthesis
%A Halley Young
%A Osbert Bastani
%A Mayur Naik
%B Proceedings of the 36th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Ruslan Salakhutdinov
%F pmlr-v97-young19a
%I PMLR
%P 7144--7153
%U https://proceedings.mlr.press/v97/young19a.html
%V 97
%X Generative models have become significantly more powerful in recent years. However, these models continue to have difficulty capturing global structure in data. For example, images of buildings typically contain spatial patterns such as windows repeating at regular intervals, but state-of-the-art models have difficulty generating these patterns. We propose to address this problem by incorporating programs representing global structure into generative models (e.g., a 2D for-loop may represent a repeating pattern of windows), along with a framework for learning these models by leveraging program synthesis to obtain training data. On both synthetic and real-world data, we demonstrate that our approach substantially outperforms state-of-the-art at both generating and completing images with global structure.
APA
Young, H., Bastani, O., & Naik, M. (2019). Learning Neurosymbolic Generative Models via Program Synthesis. Proceedings of the 36th International Conference on Machine Learning, in Proceedings of Machine Learning Research 97:7144-7153. Available from https://proceedings.mlr.press/v97/young19a.html.