Convergence Rates of Variational Inference in Sparse Deep Learning

Badr-Eddine Chérief-Abdellatif
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:1831-1842, 2020.

Abstract

Variational inference is becoming increasingly popular for approximating intractable posterior distributions in Bayesian statistics and machine learning. Meanwhile, a few recent works have provided theoretical justification and new insights on deep neural networks for estimating smooth functions in standard settings such as nonparametric regression. In this paper, we show that variational inference for sparse deep learning retains precisely the same generalization properties as exact Bayesian inference. In particular, we show that a suitable choice of the neural network architecture leads to near-minimax rates of convergence for Hölder-smooth functions. Additionally, we show that model selection over the network architecture via ELBO maximization does not overfit and adaptively achieves the optimal rate of convergence.
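For reference, the evidence lower bound (ELBO) mentioned above admits a standard form; the notation below (variational family $\mathcal{Q}$, prior $\pi$ on the network weights $\theta$, sample $X^n$ of size $n$) is generic rather than the paper's own:

\[
\mathrm{ELBO}(q) = \mathbb{E}_{\theta \sim q}\big[\log p(X^n \mid \theta)\big] - \mathrm{KL}(q \,\|\, \pi),
\qquad
\hat{q} = \operatorname*{arg\,max}_{q \in \mathcal{Q}} \mathrm{ELBO}(q).
\]

Since $\log p(X^n) = \mathrm{ELBO}(q) + \mathrm{KL}\big(q \,\|\, p(\cdot \mid X^n)\big)$, maximizing the ELBO over $\mathcal{Q}$ is equivalent to minimizing the KL divergence between $q$ and the exact posterior, and comparing maximized ELBOs across candidate architectures yields the model-selection criterion the abstract refers to.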

Cite this Paper


BibTeX
@InProceedings{pmlr-v119-cherief-abdellatif20a,
  title     = {Convergence Rates of Variational Inference in Sparse Deep Learning},
  author    = {Ch{\'e}rief-Abdellatif, Badr-Eddine},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {1831--1842},
  year      = {2020},
  editor    = {III, Hal Daumé and Singh, Aarti},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/cherief-abdellatif20a/cherief-abdellatif20a.pdf},
  url       = {https://proceedings.mlr.press/v119/cherief-abdellatif20a.html}
}
APA
Chérief-Abdellatif, B. (2020). Convergence Rates of Variational Inference in Sparse Deep Learning. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:1831-1842. Available from https://proceedings.mlr.press/v119/cherief-abdellatif20a.html.
