Autoregressive Quantile Networks for Generative Modeling

Georg Ostrovski, Will Dabney, Remi Munos
Proceedings of the 35th International Conference on Machine Learning, PMLR 80:3936-3945, 2018.

Abstract

We introduce autoregressive implicit quantile networks (AIQN), a fundamentally different approach to generative modeling than those commonly used, that implicitly captures the distribution using quantile regression. AIQN is able to achieve superior perceptual quality and improvements in evaluation metrics, without incurring a loss of sample diversity. The method can be applied to many existing models and architectures. In this work we extend the PixelCNN model with AIQN and demonstrate results on CIFAR-10 and ImageNet using Inception scores, FID, non-cherry-picked samples, and inpainting results. We consistently observe that AIQN yields a highly stable algorithm that improves perceptual quality while maintaining a highly diverse distribution.
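Since this page carries only the abstract, the following is a minimal sketch (not the authors' released code) of the mechanism it describes: a network receives a uniformly sampled quantile level tau and is trained with the standard quantile regression ("pinball") loss, so that it implicitly learns the quantile function of the target distribution. The toy 1-D setup and all variable names below are illustrative assumptions; AIQN applies this idea per-dimension inside an autoregressive model such as PixelCNN.

import torch

def quantile_loss(pred, target, tau):
    # Pinball loss: rho_tau(u) = u * (tau - 1{u < 0}), with u = target - pred.
    u = target - pred
    return (u * (tau - (u < 0).float())).mean()

# Toy usage: learn the quantile function of a fixed 1-D distribution.
net = torch.nn.Sequential(torch.nn.Linear(1, 64), torch.nn.ReLU(),
                          torch.nn.Linear(64, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for _ in range(2000):
    tau = torch.rand(256, 1)               # quantile levels tau ~ U(0, 1)
    target = torch.randn(256, 1) * 2 + 1   # samples from N(1, 4)
    loss = quantile_loss(net(tau), target, tau)
    opt.zero_grad(); loss.backward(); opt.step()

# Sampling is implicit: draw tau ~ U(0, 1) and evaluate the learned
# quantile function, which transforms uniform noise into model samples.
samples = net(torch.rand(1000, 1))

Minimizing the expected pinball loss over tau drives net(tau) toward the true tau-th quantile, which is why sampling reduces to evaluating the network at uniform random tau.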

Cite this Paper

BibTeX
@InProceedings{pmlr-v80-ostrovski18a,
  title     = {Autoregressive Quantile Networks for Generative Modeling},
  author    = {Ostrovski, Georg and Dabney, Will and Munos, Remi},
  booktitle = {Proceedings of the 35th International Conference on Machine Learning},
  pages     = {3936--3945},
  year      = {2018},
  editor    = {Dy, Jennifer and Krause, Andreas},
  volume    = {80},
  series    = {Proceedings of Machine Learning Research},
  month     = {10--15 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v80/ostrovski18a/ostrovski18a.pdf},
  url       = {https://proceedings.mlr.press/v80/ostrovski18a.html},
  abstract  = {We introduce autoregressive implicit quantile networks (AIQN), a fundamentally different approach to generative modeling than those commonly used, that implicitly captures the distribution using quantile regression. AIQN is able to achieve superior perceptual quality and improvements in evaluation metrics, without incurring a loss of sample diversity. The method can be applied to many existing models and architectures. In this work we extend the PixelCNN model with AIQN and demonstrate results on CIFAR-10 and ImageNet using Inception scores, FID, non-cherry-picked samples, and inpainting results. We consistently observe that AIQN yields a highly stable algorithm that improves perceptual quality while maintaining a highly diverse distribution.}
}
Endnote
%0 Conference Paper
%T Autoregressive Quantile Networks for Generative Modeling
%A Georg Ostrovski
%A Will Dabney
%A Remi Munos
%B Proceedings of the 35th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2018
%E Jennifer Dy
%E Andreas Krause
%F pmlr-v80-ostrovski18a
%I PMLR
%P 3936--3945
%U https://proceedings.mlr.press/v80/ostrovski18a.html
%V 80
%X We introduce autoregressive implicit quantile networks (AIQN), a fundamentally different approach to generative modeling than those commonly used, that implicitly captures the distribution using quantile regression. AIQN is able to achieve superior perceptual quality and improvements in evaluation metrics, without incurring a loss of sample diversity. The method can be applied to many existing models and architectures. In this work we extend the PixelCNN model with AIQN and demonstrate results on CIFAR-10 and ImageNet using Inception scores, FID, non-cherry-picked samples, and inpainting results. We consistently observe that AIQN yields a highly stable algorithm that improves perceptual quality while maintaining a highly diverse distribution.
APA
Ostrovski, G., Dabney, W. & Munos, R. (2018). Autoregressive Quantile Networks for Generative Modeling. Proceedings of the 35th International Conference on Machine Learning, in Proceedings of Machine Learning Research 80:3936-3945. Available from https://proceedings.mlr.press/v80/ostrovski18a.html.
