Subspace Inference for Bayesian Deep Learning

Pavel Izmailov, Wesley J. Maddox, Polina Kirichenko, Timur Garipov, Dmitry Vetrov, Andrew Gordon Wilson
Proceedings of The 35th Uncertainty in Artificial Intelligence Conference, PMLR 115:1169-1179, 2020.

Abstract

Bayesian inference was once a gold standard for learning with neural networks, providing accurate full predictive distributions and well calibrated uncertainty. However, scaling Bayesian inference techniques to deep neural networks is challenging due to the high dimensionality of the parameter space. In this paper, we construct low-dimensional subspaces of parameter space, such as the first principal components of the stochastic gradient descent (SGD) trajectory, which contain diverse sets of high performing models. In these subspaces, we are able to apply elliptical slice sampling and variational inference, which struggle in the full parameter space. We show that Bayesian model averaging over the induced posterior in these subspaces produces accurate predictions and well-calibrated predictive uncertainty for both regression and image classification.
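The subspace construction described in the abstract can be sketched roughly as follows: collect weight snapshots along the SGD trajectory, take the top principal components of their deviations from the trajectory mean, and parameterize models as a shift within that low-dimensional basis. This is a minimal illustrative sketch, not the authors' implementation; the snapshot matrix here is random stand-in data, and all names (`snapshots`, `theta`) are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for SGD iterates of a network's flattened weights; in practice
# these would be checkpoints saved along the training trajectory.
snapshots = rng.normal(size=(50, 1000))  # 50 snapshots, 1000 parameters

w_hat = snapshots.mean(axis=0)       # mean of the trajectory
deviations = snapshots - w_hat       # center the snapshots

# Top-k principal directions of the trajectory via SVD.
k = 5
_, _, vt = np.linalg.svd(deviations, full_matrices=False)
P = vt[:k].T                         # (1000, k) subspace basis

def theta(z):
    """Map low-dimensional coordinates z back to full parameter space."""
    return w_hat + P @ z

# Any inference method (e.g. elliptical slice sampling or variational
# inference) now only needs to explore the k-dimensional coordinates z.
z = rng.normal(size=k)
full_params = theta(z)
```

Inference then runs over `z` alone, and each sampled `z` induces a full network via `theta(z)` for Bayesian model averaging.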

Cite this Paper


BibTeX
@InProceedings{pmlr-v115-izmailov20a,
  title     = {Subspace Inference for Bayesian Deep Learning},
  author    = {Izmailov, Pavel and Maddox, Wesley J. and Kirichenko, Polina and Garipov, Timur and Vetrov, Dmitry and Wilson, Andrew Gordon},
  booktitle = {Proceedings of The 35th Uncertainty in Artificial Intelligence Conference},
  pages     = {1169--1179},
  year      = {2020},
  editor    = {Adams, Ryan P. and Gogate, Vibhav},
  volume    = {115},
  series    = {Proceedings of Machine Learning Research},
  month     = {22--25 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v115/izmailov20a/izmailov20a.pdf},
  url       = {http://proceedings.mlr.press/v115/izmailov20a.html},
  abstract  = {Bayesian inference was once a gold standard for learning with neural networks, providing accurate full predictive distributions and well calibrated uncertainty. However, scaling Bayesian inference techniques to deep neural networks is challenging due to the high dimensionality of the parameter space. In this paper, we construct low-dimensional subspaces of parameter space, such as the first principal components of the stochastic gradient descent (SGD) trajectory, which contain diverse sets of high performing models. In these subspaces, we are able to apply elliptical slice sampling and variational inference, which struggle in the full parameter space. We show that Bayesian model averaging over the induced posterior in these subspaces produces accurate predictions and well-calibrated predictive uncertainty for both regression and image classification.}
}
Endnote
%0 Conference Paper
%T Subspace Inference for Bayesian Deep Learning
%A Pavel Izmailov
%A Wesley J. Maddox
%A Polina Kirichenko
%A Timur Garipov
%A Dmitry Vetrov
%A Andrew Gordon Wilson
%B Proceedings of The 35th Uncertainty in Artificial Intelligence Conference
%C Proceedings of Machine Learning Research
%D 2020
%E Ryan P. Adams
%E Vibhav Gogate
%F pmlr-v115-izmailov20a
%I PMLR
%P 1169--1179
%U http://proceedings.mlr.press/v115/izmailov20a.html
%V 115
%X Bayesian inference was once a gold standard for learning with neural networks, providing accurate full predictive distributions and well calibrated uncertainty. However, scaling Bayesian inference techniques to deep neural networks is challenging due to the high dimensionality of the parameter space. In this paper, we construct low-dimensional subspaces of parameter space, such as the first principal components of the stochastic gradient descent (SGD) trajectory, which contain diverse sets of high performing models. In these subspaces, we are able to apply elliptical slice sampling and variational inference, which struggle in the full parameter space. We show that Bayesian model averaging over the induced posterior in these subspaces produces accurate predictions and well-calibrated predictive uncertainty for both regression and image classification.
APA
Izmailov, P., Maddox, W.J., Kirichenko, P., Garipov, T., Vetrov, D. & Wilson, A.G. (2020). Subspace Inference for Bayesian Deep Learning. Proceedings of The 35th Uncertainty in Artificial Intelligence Conference, in Proceedings of Machine Learning Research 115:1169-1179. Available from http://proceedings.mlr.press/v115/izmailov20a.html.