Radial Bayesian Neural Networks: Beyond Discrete Support In Large-Scale Bayesian Deep Learning
Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, PMLR 108:1352-1362, 2020.
Abstract
We propose Radial Bayesian Neural Networks (BNNs): a variational approximate posterior for BNNs which scales well to large models. Unlike scalable Bayesian deep learning methods such as deep ensembles, which have discrete support (they assign exactly zero probability almost everywhere in weight-space), Radial BNNs maintain full support, letting them act as a prior for continual learning and avoiding the a priori implausibility of discrete support. Our method avoids a sampling problem in mean-field variational inference (MFVI) caused by the so-called "soap-bubble" pathology of multivariate Gaussians. We show that, unlike MFVI, Radial BNNs are robust to hyperparameters and can be efficiently applied to challenging real-world tasks without needing ad-hoc tweaks and intensive tuning: on a real-world medical imaging task, Radial BNNs outperform MC dropout and deep ensembles.
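To illustrate the sampling idea the abstract alludes to, below is a minimal sketch of a radial-style reparameterization: the weight noise is drawn as a uniformly random direction on the unit hypersphere scaled by a single one-dimensional radius, rather than as an isotropic multivariate Gaussian whose samples concentrate in a thin "soap-bubble" shell in high dimensions. The function name `sample_radial_weights`, the half-normal radius, and the example dimensions are illustrative assumptions, not the paper's reference implementation.

```python
import numpy as np

def sample_radial_weights(mu, sigma, rng=np.random.default_rng()):
    """Draw one weight sample from a radial-style approximate posterior.

    An MFVI sample w = mu + sigma * eps with eps ~ N(0, I) has a norm that
    concentrates near sqrt(d) in d dimensions (the 'soap-bubble').  Here we
    instead sample a direction uniformly on the unit hypersphere and a
    scalar radius from a one-dimensional (half-)normal, so typical samples
    stay O(1) standard deviations from the mean.
    """
    eps = rng.standard_normal(mu.shape)        # isotropic Gaussian draw
    direction = eps / np.linalg.norm(eps)      # uniform direction on the unit sphere
    r = np.abs(rng.standard_normal())          # scalar radius (assumed half-normal here)
    return mu + sigma * direction * r

# Hypothetical usage with a 10,000-dimensional weight vector.
mu = np.zeros(10_000)
sigma = 0.1 * np.ones(10_000)
w = sample_radial_weights(mu, sigma)
# Distance from the mean in units of sigma: typically O(1) for the radial
# sample, versus ~sqrt(10_000) = 100 for an MFVI sample.
print(np.linalg.norm((w - mu) / sigma))
```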