Noisy Natural Gradient as Variational Inference
Proceedings of the 35th International Conference on Machine Learning, PMLR 80:5852-5861, 2018.
Abstract
Variational Bayesian neural nets combine the flexibility of deep learning with Bayesian uncertainty estimation. Unfortunately, there is a tradeoff between cheap but simple variational families (e.g. fully factorized) and expensive but complicated inference procedures. We show that natural gradient ascent with adaptive weight noise implicitly fits a variational posterior to maximize the evidence lower bound (ELBO). This insight allows us to train full-covariance, fully factorized, or matrix-variate Gaussian variational posteriors using noisy versions of natural gradient, Adam, and K-FAC, respectively, making it possible to scale up to modern-size ConvNets. On standard regression benchmarks, our noisy K-FAC algorithm makes better predictions and matches Hamiltonian Monte Carlo's predictive variances better than existing methods. Its improved uncertainty estimates lead to more efficient exploration in active learning, and intrinsic motivation for reinforcement learning.
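As a rough illustration of the central idea (not the paper's exact algorithm), the sketch below fits a one-weight linear regression with a "noisy Adam"-style loop: weights are sampled from a Gaussian whose variance is a damped inverse of a running second-moment (Fisher) estimate, and the mean is updated with a natural-gradient-style step. The toy data, hyperparameters, and variable names are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data (illustrative assumption): y = 2*x + noise
X = rng.normal(size=(200, 1))
y = 2.0 * X[:, 0] + 0.1 * rng.normal(size=200)
N = len(y)

mu = np.zeros(1)      # variational posterior mean over the single weight
f = np.ones(1)        # running diagonal Fisher (second-moment) estimate
alpha = 0.02          # learning rate
beta = 0.99           # EMA decay for the Fisher estimate, as in Adam
prior_var = 1.0       # Gaussian prior variance on the weight

for step in range(2000):
    # Adaptive weight noise: sample from the implicit Gaussian posterior,
    # whose precision scales with N times the Fisher estimate plus the prior.
    sigma2 = 1.0 / (N * f + 1.0 / prior_var)
    w = mu + np.sqrt(sigma2) * rng.normal(size=1)

    # Gradient of the average negative log-likelihood plus prior term
    pred = X @ w
    g = X.T @ (pred - y) / N + w / (N * prior_var)

    f = beta * f + (1.0 - beta) * g**2  # second-moment update, Adam-style
    mu = mu - alpha * g / (f + 1e-8)    # natural-gradient-style mean update

print(mu)  # should settle near the true slope of 2.0
```

The key point the paper formalizes is that this kind of noisy adaptive-gradient loop is not just a heuristic: the sampled-weight updates implicitly perform variational inference, with the Gaussian family determined by the curvature approximation (diagonal for Adam, Kronecker-factored for K-FAC).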