Learning to learn generative programs with Memoised Wake-Sleep

Luke Hewitt, Tuan Anh Le, Joshua Tenenbaum
Proceedings of the 36th Conference on Uncertainty in Artificial Intelligence (UAI), PMLR 124:1278-1287, 2020.

Abstract

We study a class of neuro-symbolic generative models in which neural networks are used both for inference and as priors over symbolic, data-generating programs. As generative models, these programs capture compositional structures in a naturally explainable form. To tackle the challenge of performing program induction as an ‘inner-loop’ to learning, we propose the Memoised Wake-Sleep (MWS) algorithm, which extends Wake-Sleep by explicitly storing and reusing the best programs discovered by the inference network throughout training. We use MWS to learn accurate, explainable models in three challenging domains: stroke-based character modelling, cellular automata, and few-shot learning in a novel dataset of real-world string concepts.
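To make the memoisation idea concrete, the following is a minimal, hypothetical sketch of one MWS training iteration. Everything specific in it is an illustrative assumption rather than the paper's implementation: the latent "programs" are single symbols from a toy vocabulary, the likelihood is Gaussian, and the names GenerativeModel, RecognitionNet, mws_step, K and VOCAB are invented for this sketch. It only illustrates the three memory-based updates described in the abstract (refresh the per-datapoint memory with proposals from the inference network, train the generative model on the remembered programs, train the inference network on samples from the memory).

```python
# Hypothetical sketch of a Memoised Wake-Sleep (MWS) iteration (assumptions:
# toy single-symbol latents, Gaussian likelihood, invented class/function names).
import torch
import torch.nn as nn
import torch.nn.functional as F

K = 5        # memory slots per datapoint (assumed value)
VOCAB = 20   # size of the toy latent space


class GenerativeModel(nn.Module):
    """p(z)p(x|z): learned prior over latents plus a unit-variance Gaussian likelihood."""

    def __init__(self, x_dim):
        super().__init__()
        self.prior_logits = nn.Parameter(torch.zeros(VOCAB))
        self.decoder = nn.Embedding(VOCAB, x_dim)  # mean of p(x|z)

    def log_joint(self, z, x):
        # z: (B, n) long indices, x: (B, D)  ->  (B, n) values of log p(z, x)
        log_prior = F.log_softmax(self.prior_logits, dim=-1)[z]
        mean = self.decoder(z)                                    # (B, n, D)
        log_lik = -0.5 * ((x.unsqueeze(1) - mean) ** 2).sum(-1)   # up to a constant
        return log_prior + log_lik


class RecognitionNet(nn.Module):
    """q(z|x): proposes candidate latents for each datapoint."""

    def __init__(self, x_dim):
        super().__init__()
        self.net = nn.Linear(x_dim, VOCAB)

    def log_prob(self, z, x):
        return F.log_softmax(self.net(x), dim=-1).gather(1, z)

    def sample(self, x, n):
        return torch.multinomial(F.softmax(self.net(x), dim=-1), n, replacement=True)


def mws_step(x, memory, p, q, opt_p, opt_q):
    """One MWS iteration for a batch x, with per-datum memory of shape (B, K)."""
    # 1. Memory update: propose latents with q, then keep the K distinct
    #    candidates scoring highest under the current joint log p(z, x).
    with torch.no_grad():
        proposals = q.sample(x, K)                                # (B, K)
        new_memory = []
        for b in range(x.size(0)):
            cands = torch.unique(torch.cat([memory[b], proposals[b]]))
            scores = p.log_joint(cands.unsqueeze(0), x[b:b + 1]).squeeze(0)
            best = cands[scores.topk(min(K, len(cands))).indices]
            pad = best[:1].repeat(K - len(best))                  # keep memory size fixed
            new_memory.append(torch.cat([best, pad]))
        memory = torch.stack(new_memory)

    # 2. Generative update: treat the normalised joint scores of the memory
    #    entries as posterior weights and raise the weighted joint likelihood.
    log_joint = p.log_joint(memory, x)
    weights = log_joint.detach().softmax(dim=1)
    loss_p = -(weights * log_joint).sum(1).mean()
    opt_p.zero_grad()
    loss_p.backward()
    opt_p.step()

    # 3. Recognition update: train q on latents resampled from the memory,
    #    in place of the prior-driven fantasies of classic Wake-Sleep.
    z_mem = memory.gather(1, torch.multinomial(weights, 1))
    loss_q = -q.log_prob(z_mem, x).mean()
    opt_q.zero_grad()
    loss_q.backward()
    opt_q.step()

    return memory, loss_p.item(), loss_q.item()
```

In this sketch the memory is a (batch, K) tensor of latent indices; it could be initialised by drawing K proposals per datapoint from the untrained recognition network and then threaded through successive calls to mws_step.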

Cite this Paper


BibTeX
@InProceedings{pmlr-v124-hewitt20a,
  title     = {Learning to learn generative programs with Memoised Wake-Sleep},
  author    = {Hewitt, Luke and Anh Le, Tuan and Tenenbaum, Joshua},
  booktitle = {Proceedings of the 36th Conference on Uncertainty in Artificial Intelligence (UAI)},
  pages     = {1278--1287},
  year      = {2020},
  editor    = {Peters, Jonas and Sontag, David},
  volume    = {124},
  series    = {Proceedings of Machine Learning Research},
  month     = {03--06 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v124/hewitt20a/hewitt20a.pdf},
  url       = {https://proceedings.mlr.press/v124/hewitt20a.html}
}
Endnote
%0 Conference Paper
%T Learning to learn generative programs with Memoised Wake-Sleep
%A Luke Hewitt
%A Tuan Anh Le
%A Joshua Tenenbaum
%B Proceedings of the 36th Conference on Uncertainty in Artificial Intelligence (UAI)
%C Proceedings of Machine Learning Research
%D 2020
%E Jonas Peters
%E David Sontag
%F pmlr-v124-hewitt20a
%I PMLR
%P 1278--1287
%U https://proceedings.mlr.press/v124/hewitt20a.html
%V 124
APA
Hewitt, L., Anh Le, T. & Tenenbaum, J. (2020). Learning to learn generative programs with Memoised Wake-Sleep. Proceedings of the 36th Conference on Uncertainty in Artificial Intelligence (UAI), in Proceedings of Machine Learning Research 124:1278-1287. Available from https://proceedings.mlr.press/v124/hewitt20a.html.