Generative Models as Distributions of Functions

Emilien Dupont, Yee Whye Teh, Arnaud Doucet
Proceedings of The 25th International Conference on Artificial Intelligence and Statistics, PMLR 151:2989-3015, 2022.

Abstract

Generative models are typically trained on grid-like data such as images. As a result, the size of these models usually scales directly with the underlying grid resolution. In this paper, we abandon discretized grids and instead parameterize individual data points by continuous functions. We then build generative models by learning distributions over such functions. By treating data points as functions, we can abstract away from the specific type of data we train on and construct models that are agnostic to discretization. To train our model, we use an adversarial approach with a discriminator that acts on continuous signals. Through experiments on a wide variety of data modalities including images, 3D shapes and climate data, we demonstrate that our model can learn rich distributions of functions independently of data type and resolution.
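The abstract's core idea, representing a data point as a continuous function of coordinates rather than a grid of values, can be illustrated with a minimal coordinate-MLP sketch. This is an assumption-laden toy (random weights, NumPy, hypothetical names `init_mlp` and `f`), not the paper's implementation; it only shows why such a representation is resolution-agnostic.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_mlp(in_dim=2, hidden=32, out_dim=3):
    # Hypothetical tiny MLP representing one data point as a function
    # f: coordinates -> features (e.g. (x, y) -> RGB).
    return {
        "W1": rng.normal(0.0, 1.0, (in_dim, hidden)),
        "b1": np.zeros(hidden),
        "W2": rng.normal(0.0, 1.0, (hidden, out_dim)),
        "b2": np.zeros(out_dim),
    }

def f(params, coords):
    # coords: (N, 2) array of (x, y) in [-1, 1]^2; returns (N, 3) feature values.
    h = np.tanh(coords @ params["W1"] + params["b1"])
    return np.tanh(h @ params["W2"] + params["b2"])

# Because f is defined on continuous coordinates, the same function can be
# sampled at any grid resolution -- the representation itself has no grid.
params = init_mlp()
for res in (8, 64):
    xs = np.linspace(-1.0, 1.0, res)
    grid = np.stack(np.meshgrid(xs, xs), axis=-1).reshape(-1, 2)
    rgb = f(params, grid)  # shape (res * res, 3)
```

A generative model in this setting learns a distribution over the parameters of such functions, so the same machinery applies to images, 3D shapes, or climate data by changing only the coordinate and feature dimensions.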

Cite this Paper

BibTeX
@InProceedings{pmlr-v151-dupont22a,
  title     = {Generative Models as Distributions of Functions},
  author    = {Dupont, Emilien and Teh, Yee Whye and Doucet, Arnaud},
  booktitle = {Proceedings of The 25th International Conference on Artificial Intelligence and Statistics},
  pages     = {2989--3015},
  year      = {2022},
  editor    = {Camps-Valls, Gustau and Ruiz, Francisco J. R. and Valera, Isabel},
  volume    = {151},
  series    = {Proceedings of Machine Learning Research},
  month     = {28--30 Mar},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v151/dupont22a/dupont22a.pdf},
  url       = {https://proceedings.mlr.press/v151/dupont22a.html},
  abstract  = {Generative models are typically trained on grid-like data such as images. As a result, the size of these models usually scales directly with the underlying grid resolution. In this paper, we abandon discretized grids and instead parameterize individual data points by continuous functions. We then build generative models by learning distributions over such functions. By treating data points as functions, we can abstract away from the specific type of data we train on and construct models that are agnostic to discretization. To train our model, we use an adversarial approach with a discriminator that acts on continuous signals. Through experiments on a wide variety of data modalities including images, 3D shapes and climate data, we demonstrate that our model can learn rich distributions of functions independently of data type and resolution.}
}
Endnote
%0 Conference Paper
%T Generative Models as Distributions of Functions
%A Emilien Dupont
%A Yee Whye Teh
%A Arnaud Doucet
%B Proceedings of The 25th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2022
%E Gustau Camps-Valls
%E Francisco J. R. Ruiz
%E Isabel Valera
%F pmlr-v151-dupont22a
%I PMLR
%P 2989--3015
%U https://proceedings.mlr.press/v151/dupont22a.html
%V 151
%X Generative models are typically trained on grid-like data such as images. As a result, the size of these models usually scales directly with the underlying grid resolution. In this paper, we abandon discretized grids and instead parameterize individual data points by continuous functions. We then build generative models by learning distributions over such functions. By treating data points as functions, we can abstract away from the specific type of data we train on and construct models that are agnostic to discretization. To train our model, we use an adversarial approach with a discriminator that acts on continuous signals. Through experiments on a wide variety of data modalities including images, 3D shapes and climate data, we demonstrate that our model can learn rich distributions of functions independently of data type and resolution.
APA
Dupont, E., Teh, Y. W. & Doucet, A. (2022). Generative Models as Distributions of Functions. Proceedings of The 25th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 151:2989-3015. Available from https://proceedings.mlr.press/v151/dupont22a.html.