Neural Autoregressive Flows

Chin-Wei Huang, David Krueger, Alexandre Lacoste, Aaron Courville
Proceedings of the 35th International Conference on Machine Learning, PMLR 80:2078-2087, 2018.

Abstract

Normalizing flows and autoregressive models have been successfully combined to produce state-of-the-art results in density estimation, via Masked Autoregressive Flows (MAF) (Papamakarios et al., 2017), and to accelerate state-of-the-art WaveNet-based speech synthesis to 20x faster than real-time (Oord et al., 2017), via Inverse Autoregressive Flows (IAF) (Kingma et al., 2016). We unify and generalize these approaches, replacing the (conditionally) affine univariate transformations of MAF/IAF with a more general class of invertible univariate transformations expressed as monotonic neural networks. We demonstrate that the proposed neural autoregressive flows (NAF) are universal approximators for continuous probability distributions, and their greater expressivity allows them to better capture multimodal target distributions. Experimentally, NAF yields state-of-the-art performance on a suite of density estimation tasks and outperforms IAF in variational autoencoders trained on binarized MNIST.
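The monotonic univariate transformation the abstract describes can be sketched in a few lines. Below is a minimal illustration of a sigmoid-based monotonic transformer in the spirit of the paper's approach; the function and parameter names are illustrative, not the authors' implementation. Constraining the pre-sigmoid slopes to be positive (via softplus) and the mixing weights to the simplex (via softmax) makes the map strictly increasing, hence invertible:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def logit(y):
    return np.log(y) - np.log1p(-y)

def monotone_transform(x, a_raw, b, w_raw):
    """Illustrative monotonic scalar transformer built from sigmoids.

    a > 0 (softplus) and w on the probability simplex (softmax)
    guarantee the map is strictly increasing in x, so it is invertible
    and usable as the univariate transformer in an autoregressive flow.
    """
    a = np.log1p(np.exp(a_raw))                  # softplus -> positive slopes
    w = np.exp(w_raw) / np.exp(w_raw).sum()      # softmax  -> convex weights
    return logit(np.dot(w, sigmoid(a * x + b)))  # convex mix of sigmoids, then logit
```

In NAF the parameters `a_raw`, `b`, `w_raw` would be produced by an autoregressive conditioner network from the preceding dimensions; here they are simply free vectors for illustration.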

Cite this Paper
BibTeX
@InProceedings{pmlr-v80-huang18d,
  title     = {Neural Autoregressive Flows},
  author    = {Huang, Chin-Wei and Krueger, David and Lacoste, Alexandre and Courville, Aaron},
  booktitle = {Proceedings of the 35th International Conference on Machine Learning},
  pages     = {2078--2087},
  year      = {2018},
  editor    = {Dy, Jennifer and Krause, Andreas},
  volume    = {80},
  series    = {Proceedings of Machine Learning Research},
  month     = {10--15 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v80/huang18d/huang18d.pdf},
  url       = {https://proceedings.mlr.press/v80/huang18d.html},
  abstract  = {Normalizing flows and autoregressive models have been successfully combined to produce state-of-the-art results in density estimation, via Masked Autoregressive Flows (MAF) (Papamakarios et al., 2017), and to accelerate state-of-the-art WaveNet-based speech synthesis to 20x faster than real-time (Oord et al., 2017), via Inverse Autoregressive Flows (IAF) (Kingma et al., 2016). We unify and generalize these approaches, replacing the (conditionally) affine univariate transformations of MAF/IAF with a more general class of invertible univariate transformations expressed as monotonic neural networks. We demonstrate that the proposed neural autoregressive flows (NAF) are universal approximators for continuous probability distributions, and their greater expressivity allows them to better capture multimodal target distributions. Experimentally, NAF yields state-of-the-art performance on a suite of density estimation tasks and outperforms IAF in variational autoencoders trained on binarized MNIST.}
}
Endnote
%0 Conference Paper
%T Neural Autoregressive Flows
%A Chin-Wei Huang
%A David Krueger
%A Alexandre Lacoste
%A Aaron Courville
%B Proceedings of the 35th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2018
%E Jennifer Dy
%E Andreas Krause
%F pmlr-v80-huang18d
%I PMLR
%P 2078--2087
%U https://proceedings.mlr.press/v80/huang18d.html
%V 80
%X Normalizing flows and autoregressive models have been successfully combined to produce state-of-the-art results in density estimation, via Masked Autoregressive Flows (MAF) (Papamakarios et al., 2017), and to accelerate state-of-the-art WaveNet-based speech synthesis to 20x faster than real-time (Oord et al., 2017), via Inverse Autoregressive Flows (IAF) (Kingma et al., 2016). We unify and generalize these approaches, replacing the (conditionally) affine univariate transformations of MAF/IAF with a more general class of invertible univariate transformations expressed as monotonic neural networks. We demonstrate that the proposed neural autoregressive flows (NAF) are universal approximators for continuous probability distributions, and their greater expressivity allows them to better capture multimodal target distributions. Experimentally, NAF yields state-of-the-art performance on a suite of density estimation tasks and outperforms IAF in variational autoencoders trained on binarized MNIST.
APA
Huang, C., Krueger, D., Lacoste, A. & Courville, A. (2018). Neural Autoregressive Flows. Proceedings of the 35th International Conference on Machine Learning, in Proceedings of Machine Learning Research 80:2078-2087. Available from https://proceedings.mlr.press/v80/huang18d.html.