Quantitative Universal Approximation Bounds for Deep Belief Networks

Julian Sieber, Johann Gehringer
Proceedings of the 40th International Conference on Machine Learning, PMLR 202:31773-31787, 2023.

Abstract

We show that deep belief networks with binary hidden units can approximate any multivariate probability density under very mild integrability requirements on the parental density of the visible nodes. The approximation is measured in the $L^q$-norm for $q\in[1,\infty]$ ($q=\infty$ corresponding to the supremum norm) and in Kullback-Leibler divergence. Furthermore, we establish sharp quantitative bounds on the approximation error in terms of the number of hidden units.
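For reference, the two error metrics named in the abstract are standard; a minimal sketch of their definitions follows, writing $p$ for the target density on $\mathbb{R}^d$ and $p_n$ for the density realized by a network with $n$ hidden units (these symbols are chosen here for illustration and are not the paper's notation):

% Illustrative notation: p = target density on R^d, p_n = DBN density with n hidden units.
$$\|p - p_n\|_{L^q} = \Bigl(\int_{\mathbb{R}^d} |p(x) - p_n(x)|^q \, dx\Bigr)^{1/q}, \qquad q \in [1,\infty),$$
$$\|p - p_n\|_{L^\infty} = \operatorname*{ess\,sup}_{x \in \mathbb{R}^d} |p(x) - p_n(x)|, \qquad \mathrm{KL}(p \,\|\, p_n) = \int_{\mathbb{R}^d} p(x)\, \log\frac{p(x)}{p_n(x)} \, dx.$$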

Cite this Paper


BibTeX
@InProceedings{pmlr-v202-sieber23a,
  title     = {Quantitative Universal Approximation Bounds for Deep Belief Networks},
  author    = {Sieber, Julian and Gehringer, Johann},
  booktitle = {Proceedings of the 40th International Conference on Machine Learning},
  pages     = {31773--31787},
  year      = {2023},
  editor    = {Krause, Andreas and Brunskill, Emma and Cho, Kyunghyun and Engelhardt, Barbara and Sabato, Sivan and Scarlett, Jonathan},
  volume    = {202},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--29 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v202/sieber23a/sieber23a.pdf},
  url       = {https://proceedings.mlr.press/v202/sieber23a.html},
  abstract  = {We show that deep belief networks with binary hidden units can approximate any multivariate probability density under very mild integrability requirements on the parental density of the visible nodes. The approximation is measured in the $L^q$-norm for $q\in[1,\infty]$ ($q=\infty$ corresponding to the supremum norm) and in Kullback-Leibler divergence. Furthermore, we establish sharp quantitative bounds on the approximation error in terms of the number of hidden units.}
}
Endnote
%0 Conference Paper
%T Quantitative Universal Approximation Bounds for Deep Belief Networks
%A Julian Sieber
%A Johann Gehringer
%B Proceedings of the 40th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Andreas Krause
%E Emma Brunskill
%E Kyunghyun Cho
%E Barbara Engelhardt
%E Sivan Sabato
%E Jonathan Scarlett
%F pmlr-v202-sieber23a
%I PMLR
%P 31773--31787
%U https://proceedings.mlr.press/v202/sieber23a.html
%V 202
%X We show that deep belief networks with binary hidden units can approximate any multivariate probability density under very mild integrability requirements on the parental density of the visible nodes. The approximation is measured in the $L^q$-norm for $q\in[1,\infty]$ ($q=\infty$ corresponding to the supremum norm) and in Kullback-Leibler divergence. Furthermore, we establish sharp quantitative bounds on the approximation error in terms of the number of hidden units.
APA
Sieber, J. & Gehringer, J. (2023). Quantitative Universal Approximation Bounds for Deep Belief Networks. Proceedings of the 40th International Conference on Machine Learning, in Proceedings of Machine Learning Research 202:31773-31787. Available from https://proceedings.mlr.press/v202/sieber23a.html.