Multiplicative Normalizing Flows for Variational Bayesian Neural Networks

Christos Louizos, Max Welling
Proceedings of the 34th International Conference on Machine Learning, PMLR 70:2218-2227, 2017.

Abstract

We reinterpret multiplicative noise in neural networks as auxiliary random variables that augment the approximate posterior in a variational setting for Bayesian neural networks. We show that through this interpretation it is both efficient and straightforward to improve the approximation by employing normalizing flows while still allowing for local reparametrizations and a tractable lower bound. In experiments we show that with this new approximation we can significantly improve upon classical mean field for Bayesian neural networks on both predictive accuracy as well as predictive uncertainty.
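To make the idea concrete, here is a minimal sketch (not the paper's implementation; all names are illustrative assumptions) of one planar normalizing-flow step applied to multiplicative noise, which then scales the rows of a weight matrix as in a multiplicative-noise posterior:

```python
# Sketch: a single planar flow f(z) = z + u * tanh(w.z + b) transforms
# fully factorized multiplicative noise z, richening the approximate
# posterior; the flowed noise then multiplies the weight means row-wise.
import numpy as np

rng = np.random.default_rng(0)

def planar_flow(z, u, w, b):
    """One planar-flow step; returns transformed samples and log|det J|."""
    a = np.tanh(z @ w + b)              # (n,) scalar activation per sample
    f_z = z + np.outer(a, u)            # transformed noise, shape (n, d)
    psi = np.outer(1.0 - a**2, w)       # d tanh(w.z+b)/dz, shape (n, d)
    log_det = np.log(np.abs(1.0 + psi @ u))  # log-det of the Jacobian
    return f_z, log_det

d = 4                                        # one noise variable per input unit
z0 = 1.0 + 0.1 * rng.standard_normal((8, d))  # factorized base noise, 8 samples
u, w, b = rng.standard_normal(d), rng.standard_normal(d), 0.0
zK, log_det = planar_flow(z0, u, w, b)

# Multiplicative posterior over weights: scale each row of the weight
# means W_mu by the flowed noise (here, for the first sample).
W_mu = rng.standard_normal((d, 3))
W_sample = zK[0][:, None] * W_mu             # one posterior weight sample
```

In the paper the flow parameters are learned jointly with the variational parameters, and the log-det terms enter the (auxiliary) variational lower bound; this sketch only shows the forward transformation and row-wise weight scaling.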

Cite this Paper


BibTeX
@InProceedings{pmlr-v70-louizos17a,
  title =     {Multiplicative Normalizing Flows for Variational {B}ayesian Neural Networks},
  author =    {Christos Louizos and Max Welling},
  booktitle = {Proceedings of the 34th International Conference on Machine Learning},
  pages =     {2218--2227},
  year =      {2017},
  editor =    {Doina Precup and Yee Whye Teh},
  volume =    {70},
  series =    {Proceedings of Machine Learning Research},
  address =   {International Convention Centre, Sydney, Australia},
  month =     {06--11 Aug},
  publisher = {PMLR},
  pdf =       {http://proceedings.mlr.press/v70/louizos17a/louizos17a.pdf},
  url =       {http://proceedings.mlr.press/v70/louizos17a.html},
  abstract =  {We reinterpret multiplicative noise in neural networks as auxiliary random variables that augment the approximate posterior in a variational setting for Bayesian neural networks. We show that through this interpretation it is both efficient and straightforward to improve the approximation by employing normalizing flows while still allowing for local reparametrizations and a tractable lower bound. In experiments we show that with this new approximation we can significantly improve upon classical mean field for Bayesian neural networks on both predictive accuracy as well as predictive uncertainty.}
}
Endnote
%0 Conference Paper
%T Multiplicative Normalizing Flows for Variational Bayesian Neural Networks
%A Christos Louizos
%A Max Welling
%B Proceedings of the 34th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2017
%E Doina Precup
%E Yee Whye Teh
%F pmlr-v70-louizos17a
%I PMLR
%J Proceedings of Machine Learning Research
%P 2218--2227
%U http://proceedings.mlr.press
%V 70
%W PMLR
%X We reinterpret multiplicative noise in neural networks as auxiliary random variables that augment the approximate posterior in a variational setting for Bayesian neural networks. We show that through this interpretation it is both efficient and straightforward to improve the approximation by employing normalizing flows while still allowing for local reparametrizations and a tractable lower bound. In experiments we show that with this new approximation we can significantly improve upon classical mean field for Bayesian neural networks on both predictive accuracy as well as predictive uncertainty.
APA
Louizos, C. & Welling, M. (2017). Multiplicative Normalizing Flows for Variational Bayesian Neural Networks. Proceedings of the 34th International Conference on Machine Learning, in PMLR 70:2218-2227.
