Variational Inference Failures Under Model Symmetries: Permutation Invariant Posteriors for Bayesian Neural Networks

Yoav Gelberg, Tycho F. A. van der Ouderaa, Mark van der Wilk, Yarin Gal
Proceedings of the Geometry-grounded Representation Learning and Generative Modeling Workshop (GRaM), PMLR 251:233-248, 2024.

Abstract

Weight space symmetries in neural network architectures, such as permutation symmetries in MLPs, give rise to Bayesian neural network (BNN) posteriors with many equivalent modes. This multimodality poses a challenge for variational inference (VI) techniques, which typically rely on approximating the posterior with a unimodal distribution. In this work, we investigate the impact of weight space permutation symmetries on VI. We demonstrate, both theoretically and empirically, that these symmetries lead to biases in the approximate posterior, which degrade predictive performance and posterior fit if not explicitly accounted for. To mitigate this behavior, we leverage the symmetric structure of the posterior and devise a symmetrization mechanism for constructing permutation invariant variational posteriors. We show that the symmetrized distribution has a strictly better fit to the true posterior, and that it can be trained using the original ELBO objective with a modified KL regularization term. We demonstrate experimentally that our approach mitigates the aforementioned biases and results in improved predictions and a higher ELBO.
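As a rough illustration of the construction described in the abstract (the notation G for the permutation group, q for a base variational distribution, and q_sym for its symmetrization is ours, introduced here for exposition and not taken verbatim from the paper), the permutation invariant posterior can be sketched as an average of the base distribution over the group orbit:

\[
q_{\mathrm{sym}}(w) \;=\; \frac{1}{|G|} \sum_{g \in G} q(g \cdot w),
\qquad
\mathcal{L}(q_{\mathrm{sym}}) \;=\; \mathbb{E}_{q(w)}\!\big[\log p(\mathcal{D} \mid w)\big] \;-\; \mathrm{KL}\big(q_{\mathrm{sym}}(w)\,\big\|\,p(w)\big).
\]

Because a permutation g of hidden units leaves the network function, and hence the likelihood, unchanged, the expected log-likelihood under q_sym equals that under q. The ELBO therefore keeps its original form and only the KL regularizer changes, which is consistent with the abstract's statement that the symmetrized distribution "can be trained using the original ELBO objective with a modified KL regularization term".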

Cite this Paper


BibTeX
@InProceedings{pmlr-v251-gelberg24a,
  title     = {Variational Inference Failures Under Model Symmetries: Permutation Invariant Posteriors for Bayesian Neural Networks},
  author    = {Gelberg, Yoav and van der Ouderaa, Tycho F. A. and van der Wilk, Mark and Gal, Yarin},
  booktitle = {Proceedings of the Geometry-grounded Representation Learning and Generative Modeling Workshop (GRaM)},
  pages     = {233--248},
  year      = {2024},
  editor    = {Vadgama, Sharvaree and Bekkers, Erik and Pouplin, Alison and Kaba, Sekou-Oumar and Walters, Robin and Lawrence, Hannah and Emerson, Tegan and Kvinge, Henry and Tomczak, Jakub and Jegelka, Stephanie},
  volume    = {251},
  series    = {Proceedings of Machine Learning Research},
  month     = {29 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v251/main/assets/gelberg24a/gelberg24a.pdf},
  url       = {https://proceedings.mlr.press/v251/gelberg24a.html},
  abstract  = {Weight space symmetries in neural network architectures, such as permutation symmetries in MLPs, give rise to Bayesian neural network (BNN) posteriors with many equivalent modes. This multimodality poses a challenge for variational inference (VI) techniques, which typically rely on approximating the posterior with a unimodal distribution. In this work, we investigate the impact of weight space permutation symmetries on VI. We demonstrate, both theoretically and empirically, that these symmetries lead to biases in the approximate posterior, which degrade predictive performance and posterior fit if not explicitly accounted for. To mitigate this behavior, we leverage the symmetric structure of the posterior and devise a symmetrization mechanism for constructing permutation invariant variational posteriors. We show that the symmetrized distribution has a strictly better fit to the true posterior, and that it can be trained using the original ELBO objective with a modified KL regularization term. We demonstrate experimentally that our approach mitigates the aforementioned biases and results in improved predictions and a higher ELBO.}
}
Endnote
%0 Conference Paper
%T Variational Inference Failures Under Model Symmetries: Permutation Invariant Posteriors for Bayesian Neural Networks
%A Yoav Gelberg
%A Tycho F. A. van der Ouderaa
%A Mark van der Wilk
%A Yarin Gal
%B Proceedings of the Geometry-grounded Representation Learning and Generative Modeling Workshop (GRaM)
%C Proceedings of Machine Learning Research
%D 2024
%E Sharvaree Vadgama
%E Erik Bekkers
%E Alison Pouplin
%E Sekou-Oumar Kaba
%E Robin Walters
%E Hannah Lawrence
%E Tegan Emerson
%E Henry Kvinge
%E Jakub Tomczak
%E Stephanie Jegelka
%F pmlr-v251-gelberg24a
%I PMLR
%P 233--248
%U https://proceedings.mlr.press/v251/gelberg24a.html
%V 251
%X Weight space symmetries in neural network architectures, such as permutation symmetries in MLPs, give rise to Bayesian neural network (BNN) posteriors with many equivalent modes. This multimodality poses a challenge for variational inference (VI) techniques, which typically rely on approximating the posterior with a unimodal distribution. In this work, we investigate the impact of weight space permutation symmetries on VI. We demonstrate, both theoretically and empirically, that these symmetries lead to biases in the approximate posterior, which degrade predictive performance and posterior fit if not explicitly accounted for. To mitigate this behavior, we leverage the symmetric structure of the posterior and devise a symmetrization mechanism for constructing permutation invariant variational posteriors. We show that the symmetrized distribution has a strictly better fit to the true posterior, and that it can be trained using the original ELBO objective with a modified KL regularization term. We demonstrate experimentally that our approach mitigates the aforementioned biases and results in improved predictions and a higher ELBO.
APA
Gelberg, Y., van der Ouderaa, T. F. A., van der Wilk, M., & Gal, Y. (2024). Variational Inference Failures Under Model Symmetries: Permutation Invariant Posteriors for Bayesian Neural Networks. Proceedings of the Geometry-grounded Representation Learning and Generative Modeling Workshop (GRaM), in Proceedings of Machine Learning Research 251:233-248. Available from https://proceedings.mlr.press/v251/gelberg24a.html.