Statistical guarantees for generative models without domination

Nicolas Schreuder, Victor-Emmanuel Brunel, Arnak Dalalyan
Proceedings of the 32nd International Conference on Algorithmic Learning Theory, PMLR 132:1051-1071, 2021.

Abstract

In this paper, we introduce a convenient framework for studying (adversarial) generative models from a statistical perspective. It consists in modeling the generative device as a smooth transformation of the unit hypercube of a dimension that is much smaller than that of the ambient space, and in measuring the quality of the generative model by means of an integral probability metric. In the particular case of an integral probability metric defined through a smoothness class, we establish a risk bound quantifying the role of the various parameters. In particular, it clearly shows the impact of dimension reduction on the error of the generative model.
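
For reference, the integral probability metric (IPM) mentioned in the abstract is standardly defined, for a class $\mathcal{F}$ of test functions and probability measures $\mu$ and $\nu$, as

\[
  d_{\mathcal{F}}(\mu, \nu) \;=\; \sup_{f \in \mathcal{F}} \left| \int f \, \mathrm{d}\mu - \int f \, \mathrm{d}\nu \right|.
\]

In the framework sketched above, the generator can be thought of as a smooth map $g \colon [0,1]^{d^*} \to \mathbb{R}^{D}$ with latent dimension $d^*$ much smaller than the ambient dimension $D$, and the quality of the generative model is measured by $d_{\mathcal{F}}(g_{\sharp}U, P)$, where $g_{\sharp}U$ is the push-forward of the uniform distribution $U$ on the hypercube and $P$ is the target distribution. The notation ($\mathcal{F}$, $g$, $d^*$, $D$, $P$) is illustrative and not necessarily the paper's own.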

Cite this Paper


BibTeX
@InProceedings{pmlr-v132-schreuder21a,
  title     = {Statistical guarantees for generative models without domination},
  author    = {Schreuder, Nicolas and Brunel, Victor-Emmanuel and Dalalyan, Arnak},
  booktitle = {Proceedings of the 32nd International Conference on Algorithmic Learning Theory},
  pages     = {1051--1071},
  year      = {2021},
  editor    = {Feldman, Vitaly and Ligett, Katrina and Sabato, Sivan},
  volume    = {132},
  series    = {Proceedings of Machine Learning Research},
  month     = {16--19 Mar},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v132/schreuder21a/schreuder21a.pdf},
  url       = {https://proceedings.mlr.press/v132/schreuder21a.html},
  abstract  = {In this paper, we introduce a convenient framework for studying (adversarial) generative models from a statistical perspective. It consists in modeling the generative device as a smooth transformation of the unit hypercube of a dimension that is much smaller than that of the ambient space and measuring the quality of the generative model by means of an integral probability metric. In the particular case of integral probability metric defined through a smoothness class, we establish a risk bound quantifying the role of various parameters. In particular, it clearly shows the impact of dimension reduction on the error of the generative model.}
}
Endnote
%0 Conference Paper
%T Statistical guarantees for generative models without domination
%A Nicolas Schreuder
%A Victor-Emmanuel Brunel
%A Arnak Dalalyan
%B Proceedings of the 32nd International Conference on Algorithmic Learning Theory
%C Proceedings of Machine Learning Research
%D 2021
%E Vitaly Feldman
%E Katrina Ligett
%E Sivan Sabato
%F pmlr-v132-schreuder21a
%I PMLR
%P 1051--1071
%U https://proceedings.mlr.press/v132/schreuder21a.html
%V 132
%X In this paper, we introduce a convenient framework for studying (adversarial) generative models from a statistical perspective. It consists in modeling the generative device as a smooth transformation of the unit hypercube of a dimension that is much smaller than that of the ambient space and measuring the quality of the generative model by means of an integral probability metric. In the particular case of integral probability metric defined through a smoothness class, we establish a risk bound quantifying the role of various parameters. In particular, it clearly shows the impact of dimension reduction on the error of the generative model.
APA
Schreuder, N., Brunel, V.-E., & Dalalyan, A. (2021). Statistical guarantees for generative models without domination. Proceedings of the 32nd International Conference on Algorithmic Learning Theory, in Proceedings of Machine Learning Research 132:1051-1071. Available from https://proceedings.mlr.press/v132/schreuder21a.html.
