Gaussianization Flows

Chenlin Meng, Yang Song, Jiaming Song, Stefano Ermon
Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, PMLR 108:4336-4345, 2020.

Abstract

Iterative Gaussianization is a fixed-point iteration procedure that allows one to transform a continuous distribution into a Gaussian distribution. Based on iterative Gaussianization, we propose a new type of normalizing flow model that grants both efficient computation of likelihoods and efficient inversion for sample generation. We demonstrate that these new flow models, named Gaussianization flows, are universal approximators for continuous probability distributions under some regularity conditions. This guaranteed expressivity enables them to capture multimodal target distributions better without compromising efficiency in sample generation. Experimentally, we show that Gaussianization flows achieve better or comparable performance on several tabular datasets compared to other efficiently invertible flow models such as Real NVP, Glow and FFJORD. In particular, Gaussianization flows are easier to initialize, are more robust to different transformations of the training data, and generalize better on small training sets.
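To make the underlying procedure concrete, below is a minimal sketch of classical iterative Gaussianization, the fixed-point iteration the paper builds on: each iteration Gaussianizes every marginal via its empirical CDF and the inverse Gaussian CDF, then mixes dimensions with a rotation. This is not the paper's trainable Gaussianization flow; the function names and the choice of a random orthogonal rotation are illustrative assumptions.

```python
import numpy as np
from scipy import stats

def marginal_gaussianize(x):
    """Map each dimension to approximately N(0, 1) via its empirical CDF."""
    n, _ = x.shape
    u = (stats.rankdata(x, axis=0) - 0.5) / n   # empirical CDF values in (0, 1)
    return stats.norm.ppf(u)                    # inverse Gaussian CDF

def iterative_gaussianization(x, n_iters=50, seed=0):
    """Alternate marginal Gaussianization with random orthogonal rotations."""
    rng = np.random.default_rng(seed)
    d = x.shape[1]
    for _ in range(n_iters):
        x = marginal_gaussianize(x)
        q, _ = np.linalg.qr(rng.standard_normal((d, d)))  # random rotation
        x = x @ q
    return x

# Example: Gaussianize samples from a bimodal 2-D distribution.
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-3, 1, (500, 2)), rng.normal(3, 1, (500, 2))])
z = iterative_gaussianization(x)  # z should look approximately standard normal
```

The paper's contribution, roughly, is to replace these fixed, data-dependent steps with learnable, efficiently invertible layers so the whole transformation can be trained end-to-end as a normalizing flow.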

Cite this Paper


BibTeX
@InProceedings{pmlr-v108-meng20b,
  title     = {Gaussianization Flows},
  author    = {Meng, Chenlin and Song, Yang and Song, Jiaming and Ermon, Stefano},
  booktitle = {Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics},
  pages     = {4336--4345},
  year      = {2020},
  editor    = {Chiappa, Silvia and Calandra, Roberto},
  volume    = {108},
  series    = {Proceedings of Machine Learning Research},
  month     = {26--28 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v108/meng20b/meng20b.pdf},
  url       = {https://proceedings.mlr.press/v108/meng20b.html},
  abstract  = {Iterative Gaussianization is a fixed-point iteration procedure that allows one to transform a continuous distribution into a Gaussian distribution. Based on iterative Gaussianization, we propose a new type of normalizing flow model that grants both efficient computation of likelihoods and efficient inversion for sample generation. We demonstrate that these new flow models, named Gaussianization flows, are universal approximators for continuous probability distributions under some regularity conditions. This guaranteed expressivity enables them to capture multimodal target distributions better without compromising efficiency in sample generation. Experimentally, we show that Gaussianization flows achieve better or comparable performance on several tabular datasets compared to other efficiently invertible flow models such as Real NVP, Glow and FFJORD. In particular, Gaussianization flows are easier to initialize, are more robust to different transformations of the training data, and generalize better on small training sets.}
}
Endnote
%0 Conference Paper
%T Gaussianization Flows
%A Chenlin Meng
%A Yang Song
%A Jiaming Song
%A Stefano Ermon
%B Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2020
%E Silvia Chiappa
%E Roberto Calandra
%F pmlr-v108-meng20b
%I PMLR
%P 4336--4345
%U https://proceedings.mlr.press/v108/meng20b.html
%V 108
%X Iterative Gaussianization is a fixed-point iteration procedure that allows one to transform a continuous distribution into a Gaussian distribution. Based on iterative Gaussianization, we propose a new type of normalizing flow model that grants both efficient computation of likelihoods and efficient inversion for sample generation. We demonstrate that these new flow models, named Gaussianization flows, are universal approximators for continuous probability distributions under some regularity conditions. This guaranteed expressivity enables them to capture multimodal target distributions better without compromising efficiency in sample generation. Experimentally, we show that Gaussianization flows achieve better or comparable performance on several tabular datasets compared to other efficiently invertible flow models such as Real NVP, Glow and FFJORD. In particular, Gaussianization flows are easier to initialize, are more robust to different transformations of the training data, and generalize better on small training sets.
APA
Meng, C., Song, Y., Song, J. & Ermon, S. (2020). Gaussianization Flows. Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 108:4336-4345. Available from https://proceedings.mlr.press/v108/meng20b.html.