Noisy Natural Gradient as Variational Inference

Guodong Zhang, Shengyang Sun, David Duvenaud, Roger Grosse
Proceedings of the 35th International Conference on Machine Learning, PMLR 80:5852-5861, 2018.

Abstract

Variational Bayesian neural nets combine the flexibility of deep learning with Bayesian uncertainty estimation. Unfortunately, there is a tradeoff between cheap but simple variational families (e.g. fully factorized) and expensive, complicated inference procedures. We show that natural gradient ascent with adaptive weight noise implicitly fits a variational posterior to maximize the evidence lower bound (ELBO). This insight allows us to train full-covariance, fully factorized, or matrix-variate Gaussian variational posteriors using noisy versions of natural gradient, Adam, and K-FAC, respectively, making it possible to scale up to modern-size ConvNets. On standard regression benchmarks, our noisy K-FAC algorithm makes better predictions and matches Hamiltonian Monte Carlo’s predictive variances better than existing methods. Its improved uncertainty estimates lead to more efficient exploration in active learning and to intrinsic motivation for reinforcement learning.
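
To make the central claim concrete, here is a minimal NumPy sketch in the spirit of the paper's noisy Adam: the same running estimate of the diagonal Fisher information both preconditions the update of the posterior mean and sets the variance of the weight noise, so the adaptive-noise optimizer is implicitly fitting a fully factorized Gaussian posterior. This is an illustrative simplification, not the published algorithm: the unit-variance Gaussian likelihood, the empirical-Fisher estimate from squared per-example gradients, the KL weight, and the omission of momentum and bias correction are all assumptions made for brevity.

import numpy as np

rng = np.random.default_rng(0)

# Toy problem: Bayesian linear regression with a unit-variance Gaussian
# likelihood (an assumption made to keep the sketch short).
N, D = 200, 5
X = rng.normal(size=(N, D))
w_true = rng.normal(size=D)
y = X @ w_true + rng.normal(size=N)

prior_var = 1.0            # prior p(w) = N(0, prior_var * I)  (assumption)
kl_weight = 1.0            # weight on the KL term of the ELBO (assumption)
alpha, beta2 = 0.05, 0.99  # step size and Fisher-EMA decay    (assumptions)
batch = 32

mu = np.zeros(D)           # variational posterior mean
f = np.ones(D)             # running diagonal (empirical) Fisher estimate
prior_prec = kl_weight / (N * prior_var)

for t in range(2000):
    # Adaptive weight noise: sample weights from the Gaussian posterior whose
    # precision combines the Fisher estimate and the prior precision.
    var = (kl_weight / N) / (f + prior_prec)
    w = mu + np.sqrt(var) * rng.normal(size=D)

    # Minibatch of per-example negative-log-likelihood gradients.
    idx = rng.choice(N, size=batch, replace=False)
    resid = X[idx] @ w - y[idx]
    per_ex_grad = X[idx] * resid[:, None]          # shape (batch, D)
    g = per_ex_grad.mean(axis=0)

    # The Adam-style second-moment EMA doubles as the Fisher approximation.
    f = beta2 * f + (1 - beta2) * (per_ex_grad ** 2).mean(axis=0)

    # Preconditioned ("natural-gradient-like") step on the mean, including
    # the gradient contribution of the Gaussian prior.
    mu -= alpha * (g + prior_prec * mu) / (f + prior_prec)

print("true weights    :", np.round(w_true, 3))
print("posterior mean  :", np.round(mu, 3))
print("posterior stdev :", np.round(np.sqrt((kl_weight / N) / (f + prior_prec)), 3))

On this toy problem the fitted mean tracks the least-squares solution, and the implied per-weight standard deviation sqrt((kl_weight / N) / (f + prior_prec)) should be roughly comparable to the exact conjugate posterior's. The paper's noisy natural gradient, noisy Adam, and noisy K-FAC variants differ mainly in how the Fisher is approximated (full, diagonal, or Kronecker-factored), which in turn determines whether the variational posterior is full-covariance, fully factorized, or matrix-variate Gaussian.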

Cite this Paper


BibTeX
@InProceedings{pmlr-v80-zhang18l,
  title     = {Noisy Natural Gradient as Variational Inference},
  author    = {Zhang, Guodong and Sun, Shengyang and Duvenaud, David and Grosse, Roger},
  booktitle = {Proceedings of the 35th International Conference on Machine Learning},
  pages     = {5852--5861},
  year      = {2018},
  editor    = {Dy, Jennifer and Krause, Andreas},
  volume    = {80},
  series    = {Proceedings of Machine Learning Research},
  month     = {10--15 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v80/zhang18l/zhang18l.pdf},
  url       = {https://proceedings.mlr.press/v80/zhang18l.html},
  abstract  = {Variational Bayesian neural nets combine the flexibility of deep learning with Bayesian uncertainty estimation. Unfortunately, there is a tradeoff between cheap but simple variational families (e.g. fully factorized) or expensive and complicated inference procedures. We show that natural gradient ascent with adaptive weight noise implicitly fits a variational posterior to maximize the evidence lower bound (ELBO). This insight allows us to train full-covariance, fully factorized, or matrix-variate Gaussian variational posteriors using noisy versions of natural gradient, Adam, and K-FAC, respectively, making it possible to scale up to modern-size ConvNets. On standard regression benchmarks, our noisy K-FAC algorithm makes better predictions and matches Hamiltonian Monte Carlo’s predictive variances better than existing methods. Its improved uncertainty estimates lead to more efficient exploration in active learning, and intrinsic motivation for reinforcement learning.}
}
Endnote
%0 Conference Paper
%T Noisy Natural Gradient as Variational Inference
%A Guodong Zhang
%A Shengyang Sun
%A David Duvenaud
%A Roger Grosse
%B Proceedings of the 35th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2018
%E Jennifer Dy
%E Andreas Krause
%F pmlr-v80-zhang18l
%I PMLR
%P 5852--5861
%U https://proceedings.mlr.press/v80/zhang18l.html
%V 80
%X Variational Bayesian neural nets combine the flexibility of deep learning with Bayesian uncertainty estimation. Unfortunately, there is a tradeoff between cheap but simple variational families (e.g. fully factorized) or expensive and complicated inference procedures. We show that natural gradient ascent with adaptive weight noise implicitly fits a variational posterior to maximize the evidence lower bound (ELBO). This insight allows us to train full-covariance, fully factorized, or matrix-variate Gaussian variational posteriors using noisy versions of natural gradient, Adam, and K-FAC, respectively, making it possible to scale up to modern-size ConvNets. On standard regression benchmarks, our noisy K-FAC algorithm makes better predictions and matches Hamiltonian Monte Carlo’s predictive variances better than existing methods. Its improved uncertainty estimates lead to more efficient exploration in active learning, and intrinsic motivation for reinforcement learning.
APA
Zhang, G., Sun, S., Duvenaud, D. & Grosse, R. (2018). Noisy Natural Gradient as Variational Inference. Proceedings of the 35th International Conference on Machine Learning, in Proceedings of Machine Learning Research 80:5852-5861. Available from https://proceedings.mlr.press/v80/zhang18l.html.
