One-Shot Generalization in Deep Generative Models

Danilo Rezende, Shakir Mohamed, Ivo Danihelka, Karol Gregor, Daan Wierstra
Proceedings of The 33rd International Conference on Machine Learning, PMLR 48:1521-1529, 2016.

Abstract

Humans have an impressive ability to reason about new concepts and experiences from just a single example. In particular, humans have an ability for one-shot generalization: an ability to encounter a new concept, understand its structure, and then be able to generate compelling alternative variations of the concept. We develop machine learning systems with this important capacity by developing new deep generative models, models that combine the representational power of deep learning with the inferential power of Bayesian reasoning. We develop a class of sequential generative models that are built on the principles of feedback and attention. These two characteristics lead to generative models that are among the state-of-the-art in density estimation and image generation. We demonstrate the one-shot generalization ability of our models using three tasks: unconditional sampling, generating new exemplars of a given concept, and generating new exemplars of a family of concepts. In all cases our models are able to generate compelling and diverse samples, having seen new examples just once, providing an important class of general-purpose models for one-shot machine learning.

Cite this Paper


BibTeX
@InProceedings{pmlr-v48-rezende16,
  title     = {One-Shot Generalization in Deep Generative Models},
  author    = {Rezende, Danilo and Mohamed, Shakir and Danihelka, Ivo and Gregor, Karol and Wierstra, Daan},
  booktitle = {Proceedings of The 33rd International Conference on Machine Learning},
  pages     = {1521--1529},
  year      = {2016},
  editor    = {Balcan, Maria Florina and Weinberger, Kilian Q.},
  volume    = {48},
  series    = {Proceedings of Machine Learning Research},
  address   = {New York, New York, USA},
  month     = {20--22 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v48/rezende16.pdf},
  url       = {https://proceedings.mlr.press/v48/rezende16.html},
  abstract  = {Humans have an impressive ability to reason about new concepts and experiences from just a single example. In particular, humans have an ability for one-shot generalization: an ability to encounter a new concept, understand its structure, and then be able to generate compelling alternative variations of the concept. We develop machine learning systems with this important capacity by developing new deep generative models, models that combine the representational power of deep learning with the inferential power of Bayesian reasoning. We develop a class of sequential generative models that are built on the principles of feedback and attention. These two characteristics lead to generative models that are among the state-of-the-art in density estimation and image generation. We demonstrate the one-shot generalization ability of our models using three tasks: unconditional sampling, generating new exemplars of a given concept, and generating new exemplars of a family of concepts. In all cases our models are able to generate compelling and diverse samples, having seen new examples just once, providing an important class of general-purpose models for one-shot machine learning.}
}
Endnote
%0 Conference Paper
%T One-Shot Generalization in Deep Generative Models
%A Danilo Rezende
%A Shakir Mohamed
%A Ivo Danihelka
%A Karol Gregor
%A Daan Wierstra
%B Proceedings of The 33rd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2016
%E Maria Florina Balcan
%E Kilian Q. Weinberger
%F pmlr-v48-rezende16
%I PMLR
%P 1521--1529
%U https://proceedings.mlr.press/v48/rezende16.html
%V 48
%X Humans have an impressive ability to reason about new concepts and experiences from just a single example. In particular, humans have an ability for one-shot generalization: an ability to encounter a new concept, understand its structure, and then be able to generate compelling alternative variations of the concept. We develop machine learning systems with this important capacity by developing new deep generative models, models that combine the representational power of deep learning with the inferential power of Bayesian reasoning. We develop a class of sequential generative models that are built on the principles of feedback and attention. These two characteristics lead to generative models that are among the state-of-the-art in density estimation and image generation. We demonstrate the one-shot generalization ability of our models using three tasks: unconditional sampling, generating new exemplars of a given concept, and generating new exemplars of a family of concepts. In all cases our models are able to generate compelling and diverse samples, having seen new examples just once, providing an important class of general-purpose models for one-shot machine learning.
RIS
TY  - CPAPER
TI  - One-Shot Generalization in Deep Generative Models
AU  - Danilo Rezende
AU  - Shakir Mohamed
AU  - Ivo Danihelka
AU  - Karol Gregor
AU  - Daan Wierstra
BT  - Proceedings of The 33rd International Conference on Machine Learning
DA  - 2016/06/11
ED  - Maria Florina Balcan
ED  - Kilian Q. Weinberger
ID  - pmlr-v48-rezende16
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 48
SP  - 1521
EP  - 1529
L1  - http://proceedings.mlr.press/v48/rezende16.pdf
UR  - https://proceedings.mlr.press/v48/rezende16.html
AB  - Humans have an impressive ability to reason about new concepts and experiences from just a single example. In particular, humans have an ability for one-shot generalization: an ability to encounter a new concept, understand its structure, and then be able to generate compelling alternative variations of the concept. We develop machine learning systems with this important capacity by developing new deep generative models, models that combine the representational power of deep learning with the inferential power of Bayesian reasoning. We develop a class of sequential generative models that are built on the principles of feedback and attention. These two characteristics lead to generative models that are among the state-of-the-art in density estimation and image generation. We demonstrate the one-shot generalization ability of our models using three tasks: unconditional sampling, generating new exemplars of a given concept, and generating new exemplars of a family of concepts. In all cases our models are able to generate compelling and diverse samples, having seen new examples just once, providing an important class of general-purpose models for one-shot machine learning.
ER  -
APA
Rezende, D., Mohamed, S., Danihelka, I., Gregor, K. & Wierstra, D. (2016). One-Shot Generalization in Deep Generative Models. Proceedings of The 33rd International Conference on Machine Learning, in Proceedings of Machine Learning Research 48:1521-1529. Available from https://proceedings.mlr.press/v48/rezende16.html.