Data augmentation in Bayesian neural networks and the cold posterior effect

Seth Nabarro, Stoil Ganev, Adrià Garriga-Alonso, Vincent Fortuin, Mark van der Wilk, Laurence Aitchison
Proceedings of the Thirty-Eighth Conference on Uncertainty in Artificial Intelligence, PMLR 180:1434-1444, 2022.

Abstract

Bayesian neural networks that incorporate data augmentation implicitly use a “randomly perturbed log-likelihood [which] does not have a clean interpretation as a valid likelihood function” (Izmailov et al. 2021). Here, we provide several approaches to developing principled Bayesian neural networks incorporating data augmentation. We introduce a “finite orbit” setting which allows valid likelihoods to be computed exactly, and for the more usual “full orbit” setting we derive multi-sample bounds tighter than those used previously. These models cast light on the origin of the cold posterior effect. In particular, we find that the cold posterior effect persists even in these principled models incorporating data augmentation. This suggests that the cold posterior effect cannot be dismissed as an artifact of data augmentation using incorrect likelihoods.
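As a concrete illustration of the two constructions named in the abstract (not the authors' released implementation), here is a minimal PyTorch sketch. The names are placeholders: log_probs(x) is a hypothetical classifier returning per-class log-probabilities, augmentations is the full finite list of orbit elements, and sample_aug draws a random augmentation.

import math
import torch

def finite_orbit_log_lik(log_probs, augmentations, x, y):
    # Exact log-likelihood for a finite orbit G:
    # p(y | x, w) = (1/|G|) * sum_{g in G} p(y | g(x), w), computed in log space.
    lp = torch.stack([log_probs(g(x)) for g in augmentations])      # (|G|, batch, classes)
    lp = torch.logsumexp(lp, dim=0) - math.log(len(augmentations))  # (batch, classes)
    return lp.gather(-1, y.unsqueeze(-1)).squeeze(-1)               # (batch,) true-class log-lik

def full_orbit_multi_sample_bound(log_probs, sample_aug, x, y, K=8):
    # K-sample estimator whose expectation lower-bounds the full-orbit
    # log-likelihood log E_g[p(y | g(x), w)] (by Jensen's inequality);
    # K=1 recovers the standard randomly perturbed log-likelihood.
    lp = torch.stack([log_probs(sample_aug(x)) for _ in range(K)])  # (K, batch, classes)
    lp = torch.logsumexp(lp, dim=0) - math.log(K)
    return lp.gather(-1, y.unsqueeze(-1)).squeeze(-1)

In both cases it is probabilities, not log-probabilities, that are averaged over augmentations; this is what makes the finite-orbit quantity a valid likelihood, and what makes the expected multi-sample bound tighten as K grows rather than staying at the single-sample value used by standard data augmentation.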

Cite this Paper


BibTeX
@InProceedings{pmlr-v180-nabarro22a,
  title     = {Data augmentation in Bayesian neural networks and the cold posterior effect},
  author    = {Nabarro, Seth and Ganev, Stoil and Garriga-Alonso, Adri\`a and Fortuin, Vincent and van der Wilk, Mark and Aitchison, Laurence},
  booktitle = {Proceedings of the Thirty-Eighth Conference on Uncertainty in Artificial Intelligence},
  pages     = {1434--1444},
  year      = {2022},
  editor    = {Cussens, James and Zhang, Kun},
  volume    = {180},
  series    = {Proceedings of Machine Learning Research},
  month     = {01--05 Aug},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v180/nabarro22a/nabarro22a.pdf},
  url       = {https://proceedings.mlr.press/v180/nabarro22a.html},
  abstract  = {Bayesian neural networks that incorporate data augmentation implicitly use a “randomly perturbed log-likelihood [which] does not have a clean interpretation as a valid likelihood function” (Izmailov et al. 2021). Here, we provide several approaches to developing principled Bayesian neural networks incorporating data augmentation. We introduce a “finite orbit” setting which allows valid likelihoods to be computed exactly, and for the more usual “full orbit” setting we derive multi-sample bounds tighter than those used previously. These models cast light on the origin of the cold posterior effect. In particular, we find that the cold posterior effect persists even in these principled models incorporating data augmentation. This suggests that the cold posterior effect cannot be dismissed as an artifact of data augmentation using incorrect likelihoods.}
}
Endnote
%0 Conference Paper
%T Data augmentation in Bayesian neural networks and the cold posterior effect
%A Seth Nabarro
%A Stoil Ganev
%A Adrià Garriga-Alonso
%A Vincent Fortuin
%A Mark van der Wilk
%A Laurence Aitchison
%B Proceedings of the Thirty-Eighth Conference on Uncertainty in Artificial Intelligence
%C Proceedings of Machine Learning Research
%D 2022
%E James Cussens
%E Kun Zhang
%F pmlr-v180-nabarro22a
%I PMLR
%P 1434--1444
%U https://proceedings.mlr.press/v180/nabarro22a.html
%V 180
%X Bayesian neural networks that incorporate data augmentation implicitly use a “randomly perturbed log-likelihood [which] does not have a clean interpretation as a valid likelihood function” (Izmailov et al. 2021). Here, we provide several approaches to developing principled Bayesian neural networks incorporating data augmentation. We introduce a “finite orbit” setting which allows valid likelihoods to be computed exactly, and for the more usual “full orbit” setting we derive multi-sample bounds tighter than those used previously. These models cast light on the origin of the cold posterior effect. In particular, we find that the cold posterior effect persists even in these principled models incorporating data augmentation. This suggests that the cold posterior effect cannot be dismissed as an artifact of data augmentation using incorrect likelihoods.
APA
Nabarro, S., Ganev, S., Garriga-Alonso, A., Fortuin, V., van der Wilk, M. & Aitchison, L. (2022). Data augmentation in Bayesian neural networks and the cold posterior effect. Proceedings of the Thirty-Eighth Conference on Uncertainty in Artificial Intelligence, in Proceedings of Machine Learning Research 180:1434-1444. Available from https://proceedings.mlr.press/v180/nabarro22a.html.
