Non-Parametric Priors For Generative Adversarial Networks

Rajhans Singh, Pavan Turaga, Suren Jayasuriya, Ravi Garg, Martin Braun
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:5838-5847, 2019.

Abstract

The advent of generative adversarial networks (GANs) has enabled new capabilities in synthesis, interpolation, and data augmentation heretofore considered very challenging. However, one of the common assumptions in most GAN architectures is the assumption of simple parametric latent-space distributions. While easy to implement, a simple latent-space distribution can be problematic for uses such as interpolation. This is due to distributional mismatches when samples are interpolated in the latent space. We present a straightforward formalization of this problem; using basic results from probability theory and off-the-shelf optimization tools, we develop ways to arrive at appropriate non-parametric priors. The obtained prior exhibits unusual qualitative properties in terms of its shape, and quantitative benefits in terms of lower divergence from its mid-point distribution. We demonstrate that our designed prior helps improve image generation along any Euclidean straight line during interpolation, both qualitatively and quantitatively, without any additional training or architectural modifications. The proposed formulation is quite flexible, paving the way to impose newer constraints on the latent-space statistics.
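The distributional mismatch the abstract refers to is easy to see numerically (this sketch is not from the paper; it only illustrates the standard observation that motivates it). With a standard Gaussian prior in a high-dimensional latent space, samples concentrate on a shell of radius roughly sqrt(d), while the midpoint of two independent samples is distributed as N(0, I/2) and so concentrates around sqrt(d/2) — interpolated points land in a region the generator rarely saw during training:

```python
import numpy as np

# Hypothetical settings: latent dimension d and sample count n are
# illustrative, not values used in the paper.
rng = np.random.default_rng(0)
d, n = 512, 10_000

z1 = rng.standard_normal((n, d))  # endpoint samples from N(0, I)
z2 = rng.standard_normal((n, d))
mid = 0.5 * (z1 + z2)             # midpoint is distributed as N(0, I/2)

endpoint_norm = np.linalg.norm(z1, axis=1).mean()   # concentrates near sqrt(d)
midpoint_norm = np.linalg.norm(mid, axis=1).mean()  # concentrates near sqrt(d/2)
print(endpoint_norm, midpoint_norm)
```

The ratio of the two mean norms is close to sqrt(2), confirming that linear interpolation systematically moves mass toward the origin under a Gaussian prior; the paper's non-parametric prior is designed so that the prior and its mid-point distribution diverge far less.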

Cite this Paper


BibTeX
@InProceedings{pmlr-v97-singh19a,
  title     = {Non-Parametric Priors For Generative Adversarial Networks},
  author    = {Singh, Rajhans and Turaga, Pavan and Jayasuriya, Suren and Garg, Ravi and Braun, Martin},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning},
  pages     = {5838--5847},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Salakhutdinov, Ruslan},
  volume    = {97},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--15 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v97/singh19a/singh19a.pdf},
  url       = {https://proceedings.mlr.press/v97/singh19a.html},
  abstract  = {The advent of generative adversarial networks (GAN) has enabled new capabilities in synthesis, interpolation, and data augmentation heretofore considered very challenging. However, one of the common assumptions in most GAN architectures is the assumption of simple parametric latent-space distributions. While easy to implement, a simple latent-space distribution can be problematic for uses such as interpolation. This is due to distributional mismatches when samples are interpolated in the latent space. We present a straightforward formalization of this problem; using basic results from probability theory and off-the-shelf-optimization tools, we develop ways to arrive at appropriate non-parametric priors. The obtained prior exhibits unusual qualitative properties in terms of its shape, and quantitative benefits in terms of lower divergence with its mid-point distribution. We demonstrate that our designed prior helps improve image generation along any Euclidean straight line during interpolation, both qualitatively and quantitatively, without any additional training or architectural modifications. The proposed formulation is quite flexible, paving the way to impose newer constraints on the latent-space statistics.}
}
Endnote
%0 Conference Paper
%T Non-Parametric Priors For Generative Adversarial Networks
%A Rajhans Singh
%A Pavan Turaga
%A Suren Jayasuriya
%A Ravi Garg
%A Martin Braun
%B Proceedings of the 36th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Ruslan Salakhutdinov
%F pmlr-v97-singh19a
%I PMLR
%P 5838--5847
%U https://proceedings.mlr.press/v97/singh19a.html
%V 97
%X The advent of generative adversarial networks (GAN) has enabled new capabilities in synthesis, interpolation, and data augmentation heretofore considered very challenging. However, one of the common assumptions in most GAN architectures is the assumption of simple parametric latent-space distributions. While easy to implement, a simple latent-space distribution can be problematic for uses such as interpolation. This is due to distributional mismatches when samples are interpolated in the latent space. We present a straightforward formalization of this problem; using basic results from probability theory and off-the-shelf-optimization tools, we develop ways to arrive at appropriate non-parametric priors. The obtained prior exhibits unusual qualitative properties in terms of its shape, and quantitative benefits in terms of lower divergence with its mid-point distribution. We demonstrate that our designed prior helps improve image generation along any Euclidean straight line during interpolation, both qualitatively and quantitatively, without any additional training or architectural modifications. The proposed formulation is quite flexible, paving the way to impose newer constraints on the latent-space statistics.
APA
Singh, R., Turaga, P., Jayasuriya, S., Garg, R., & Braun, M. (2019). Non-Parametric Priors For Generative Adversarial Networks. Proceedings of the 36th International Conference on Machine Learning, in Proceedings of Machine Learning Research 97:5838-5847. Available from https://proceedings.mlr.press/v97/singh19a.html.

Related Material