Minimax optimal density estimation using a shallow generative model with a one-dimensional latent variable

Hyeok Kyu Kwon, Minwoo Chae
Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, PMLR 238:469-477, 2024.

Abstract

A deep generative model yields an implicit estimator for the unknown distribution or density function of the observation. This paper investigates some statistical properties of the implicit density estimator pursued by VAE-type methods from a nonparametric density estimation framework. More specifically, we obtain convergence rates of the VAE-type density estimator under the assumption that the underlying true density function belongs to a locally Hölder class. Remarkably, a near minimax optimal rate with respect to the Hellinger metric can be achieved by the simplest network architecture, a shallow generative model with a one-dimensional latent variable.
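
As a concrete illustration of the architecture the abstract refers to, the following is a minimal sketch (in PyTorch) of a VAE-type density estimator whose generator is a single-hidden-layer (shallow) network with a one-dimensional latent variable. The layer widths, the Gaussian decoder with learned scale, and the training objective are illustrative assumptions, not the authors' exact construction; the names ShallowVAE and negative_elbo are hypothetical.

# Sketch only: a generic VAE with a shallow generator and a 1-D latent variable,
# matching the architecture described in the abstract at a high level.
import torch
import torch.nn as nn

class ShallowVAE(nn.Module):
    def __init__(self, data_dim: int, hidden: int = 128):
        super().__init__()
        # Encoder: maps x to the mean and log-variance of q(z | x), with z in R^1.
        self.encoder = nn.Sequential(nn.Linear(data_dim, hidden), nn.ReLU())
        self.enc_mu = nn.Linear(hidden, 1)
        self.enc_logvar = nn.Linear(hidden, 1)
        # Shallow generator: one hidden layer mapping z in R^1 to the mean of a
        # Gaussian decoder p(x | z) with a learned scale sigma (an assumption here).
        self.decoder = nn.Sequential(
            nn.Linear(1, hidden), nn.ReLU(), nn.Linear(hidden, data_dim)
        )
        self.log_sigma = nn.Parameter(torch.zeros(1))

    def forward(self, x):
        h = self.encoder(x)
        mu, logvar = self.enc_mu(h), self.enc_logvar(h)
        # Reparameterization: z = mu + sigma * eps, eps ~ N(0, 1).
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)
        return self.decoder(z), mu, logvar

def negative_elbo(model: ShallowVAE, x: torch.Tensor) -> torch.Tensor:
    x_mean, mu, logvar = model(x)
    sigma2 = torch.exp(2 * model.log_sigma)
    # Gaussian reconstruction term -log p(x | z), up to additive constants.
    recon = 0.5 * (((x - x_mean) ** 2) / sigma2 + 2 * model.log_sigma).sum(dim=1)
    # KL(q(z | x) || N(0, 1)) in closed form for the 1-D Gaussian posterior.
    kl = 0.5 * (mu ** 2 + logvar.exp() - logvar - 1).sum(dim=1)
    return (recon + kl).mean()

Minimizing the negative ELBO over n i.i.d. observations fits the generator; the fitted decoder then induces the implicit density estimate p_hat(x) as the standard-normal mixture of the Gaussian decoder over the one-dimensional latent variable.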

Cite this Paper


BibTeX
@InProceedings{pmlr-v238-kyu-kwon24a,
  title     = {Minimax optimal density estimation using a shallow generative model with a one-dimensional latent variable},
  author    = {Kyu Kwon, Hyeok and Chae, Minwoo},
  booktitle = {Proceedings of The 27th International Conference on Artificial Intelligence and Statistics},
  pages     = {469--477},
  year      = {2024},
  editor    = {Dasgupta, Sanjoy and Mandt, Stephan and Li, Yingzhen},
  volume    = {238},
  series    = {Proceedings of Machine Learning Research},
  month     = {02--04 May},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v238/kyu-kwon24a/kyu-kwon24a.pdf},
  url       = {https://proceedings.mlr.press/v238/kyu-kwon24a.html},
  abstract  = {A deep generative model yields an implicit estimator for the unknown distribution or density function of the observation. This paper investigates some statistical properties of the implicit density estimator pursued by VAE-type methods from a nonparametric density estimation framework. More specifically, we obtain convergence rates of the VAE-type density estimator under the assumption that the underlying true density function belongs to a locally Hölder class. Remarkably, a near minimax optimal rate with respect to the Hellinger metric can be achieved by the simplest network architecture, a shallow generative model with a one-dimensional latent variable.}
}
Endnote
%0 Conference Paper
%T Minimax optimal density estimation using a shallow generative model with a one-dimensional latent variable
%A Hyeok Kyu Kwon
%A Minwoo Chae
%B Proceedings of The 27th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2024
%E Sanjoy Dasgupta
%E Stephan Mandt
%E Yingzhen Li
%F pmlr-v238-kyu-kwon24a
%I PMLR
%P 469--477
%U https://proceedings.mlr.press/v238/kyu-kwon24a.html
%V 238
%X A deep generative model yields an implicit estimator for the unknown distribution or density function of the observation. This paper investigates some statistical properties of the implicit density estimator pursued by VAE-type methods from a nonparametric density estimation framework. More specifically, we obtain convergence rates of the VAE-type density estimator under the assumption that the underlying true density function belongs to a locally Hölder class. Remarkably, a near minimax optimal rate with respect to the Hellinger metric can be achieved by the simplest network architecture, a shallow generative model with a one-dimensional latent variable.
APA
Kyu Kwon, H. & Chae, M. (2024). Minimax optimal density estimation using a shallow generative model with a one-dimensional latent variable. Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 238:469-477. Available from https://proceedings.mlr.press/v238/kyu-kwon24a.html.