Batch norm with entropic regularization turns deterministic autoencoders into generative models

Amur Ghose, Abdullah Rashwan, Pascal Poupart
Proceedings of the 36th Conference on Uncertainty in Artificial Intelligence (UAI), PMLR 124:1079-1088, 2020.

Abstract

The variational autoencoder is a well-defined deep generative model built on an encoder-decoder framework, in which the encoding network outputs a non-deterministic code for reconstructing each input. The encoder achieves this by sampling from a distribution for every input, instead of outputting a deterministic code per input. The great advantage of this process is that it allows the network to be used as a generative model, sampling from the data distribution beyond the samples provided for training. We show in this work that using batch normalization as the source of non-determinism suffices to turn deterministic autoencoders into generative models on par with variational ones, so long as we add a suitable entropic regularization to the training objective.
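The mechanism described in the abstract lends itself to a compact illustration. Below is a minimal PyTorch sketch of the idea, not the authors' implementation: batch normalization pins each latent dimension to zero mean and unit variance over the batch, and an entropy term on the code is maximized alongside the reconstruction loss. Since the Gaussian maximizes entropy for fixed mean and variance, the codes are pushed toward N(0, I), so new data can be generated by decoding standard-normal samples. All names, layer sizes, the nearest-neighbour entropy proxy, and the weight lam are illustrative assumptions.

import torch
import torch.nn as nn

class BatchNormAE(nn.Module):
    def __init__(self, in_dim=784, latent_dim=32):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(in_dim, 256), nn.ReLU(),
            nn.Linear(256, latent_dim),
        )
        # Batch norm constrains each latent dimension to zero mean and
        # unit variance over the batch, and injects batch-dependent noise.
        self.bn = nn.BatchNorm1d(latent_dim, affine=False)
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 256), nn.ReLU(),
            nn.Linear(256, in_dim), nn.Sigmoid(),
        )

    def forward(self, x):
        z = self.bn(self.encoder(x))
        return self.decoder(z), z

def entropy_proxy(z, eps=1e-8):
    # Nearest-neighbour (Kozachenko-Leonenko style) entropy estimate, up to
    # constants: entropy grows with the log of nearest-neighbour distances,
    # so maximizing it spreads the batch of codes out. With the moments
    # pinned by batch norm, higher entropy pushes the codes toward N(0, I).
    dist = torch.cdist(z, z)
    dist = dist + torch.eye(z.size(0), device=z.device) * 1e9  # mask self
    nn_dist = dist.min(dim=1).values
    return z.size(1) * torch.log(nn_dist + eps).mean()

model = BatchNormAE()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
lam = 0.1  # regularization weight, an illustrative value

x = torch.rand(128, 784)            # stand-in batch of flattened images
x_hat, z = model(x)
loss = nn.functional.binary_cross_entropy(x_hat, x) - lam * entropy_proxy(z)
opt.zero_grad(); loss.backward(); opt.step()

# Generation: since training drives the code distribution toward N(0, I),
# decode latent vectors drawn from the standard normal.
with torch.no_grad():
    samples = model.decoder(torch.randn(16, 32))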

Cite this Paper


BibTeX
@InProceedings{pmlr-v124-ghose20a,
  title     = {Batch norm with entropic regularization turns deterministic autoencoders into generative models},
  author    = {Ghose, Amur and Rashwan, Abdullah and Poupart, Pascal},
  booktitle = {Proceedings of the 36th Conference on Uncertainty in Artificial Intelligence (UAI)},
  pages     = {1079--1088},
  year      = {2020},
  editor    = {Peters, Jonas and Sontag, David},
  volume    = {124},
  series    = {Proceedings of Machine Learning Research},
  month     = {03--06 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v124/ghose20a/ghose20a.pdf},
  url       = {https://proceedings.mlr.press/v124/ghose20a.html},
  abstract  = {The variational autoencoder is a well defined deep generative model that utilizes an encoder-decoder framework where an encoding neural network outputs a non-deterministic code for reconstructing an input. The encoder achieves this by sampling from a distribution for every input, instead of outputting a deterministic code per input. The great advantage of this process is that it allows the use of the network as a generative model for sampling from the data distribution beyond provided samples for training. We show in this work that utilizing batch normalization as a source for non-determinism suffices to turn deterministic autoencoders into generative models on par with variational ones, so long as we add a suitable entropic regularization to the training objective.}
}
Endnote
%0 Conference Paper
%T Batch norm with entropic regularization turns deterministic autoencoders into generative models
%A Amur Ghose
%A Abdullah Rashwan
%A Pascal Poupart
%B Proceedings of the 36th Conference on Uncertainty in Artificial Intelligence (UAI)
%C Proceedings of Machine Learning Research
%D 2020
%E Jonas Peters
%E David Sontag
%F pmlr-v124-ghose20a
%I PMLR
%P 1079--1088
%U https://proceedings.mlr.press/v124/ghose20a.html
%V 124
%X The variational autoencoder is a well defined deep generative model that utilizes an encoder-decoder framework where an encoding neural network outputs a non-deterministic code for reconstructing an input. The encoder achieves this by sampling from a distribution for every input, instead of outputting a deterministic code per input. The great advantage of this process is that it allows the use of the network as a generative model for sampling from the data distribution beyond provided samples for training. We show in this work that utilizing batch normalization as a source for non-determinism suffices to turn deterministic autoencoders into generative models on par with variational ones, so long as we add a suitable entropic regularization to the training objective.
APA
Ghose, A., Rashwan, A. & Poupart, P. (2020). Batch norm with entropic regularization turns deterministic autoencoders into generative models. Proceedings of the 36th Conference on Uncertainty in Artificial Intelligence (UAI), in Proceedings of Machine Learning Research 124:1079-1088. Available from https://proceedings.mlr.press/v124/ghose20a.html.
