On Feynman-Kac training of partial Bayesian neural networks

Zheng Zhao, Sebastian Mair, Thomas B. Schön, Jens Sjölund
Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, PMLR 238:3223-3231, 2024.

Abstract

Recently, partial Bayesian neural networks (pBNNs), which only consider a subset of the parameters to be stochastic, were shown to perform competitively with full Bayesian neural networks. However, pBNNs are often multi-modal in the latent variable space and thus challenging to approximate with parametric models. To address this problem, we propose an efficient sampling-based training strategy, wherein the training of a pBNN is formulated as simulating a Feynman-Kac model. We then describe variations of sequential Monte Carlo samplers that allow us to simultaneously estimate the parameters and the latent posterior distribution of this model at a tractable computational cost. Using various synthetic and real-world datasets we show that our proposed training scheme outperforms the state of the art in terms of predictive performance.
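The core idea of the abstract — sequentially reweighting, resampling, and perturbing a particle population over a Feynman-Kac model — can be illustrated with a generic sequential Monte Carlo update. The sketch below is not the authors' algorithm (the paper's samplers jointly estimate deterministic parameters alongside the latent posterior); it is a minimal bootstrap-style SMC loop on a toy scalar latent, with the jitter scale and ESS threshold chosen for illustration.

```python
import numpy as np

def smc_step(particles, log_weights, loglik_fn, rng, jitter=0.05):
    """One SMC update: reweight by the new likelihood, then resample and
    jitter the particles if the effective sample size has degenerated."""
    log_weights = log_weights + loglik_fn(particles)
    # Normalized weights and effective sample size (ESS)
    w = np.exp(log_weights - log_weights.max())
    w /= w.sum()
    ess = 1.0 / np.sum(w ** 2)
    n = len(particles)
    if ess < n / 2:  # illustrative threshold
        idx = rng.choice(n, size=n, p=w)
        # Jitter (a simple move step) keeps the population diverse
        particles = particles[idx] + jitter * rng.standard_normal(n)
        log_weights = np.zeros(n)
    return particles, log_weights

# Toy run: infer a scalar latent from a stream of noisy observations
rng = np.random.default_rng(0)
true_theta, noise = 1.5, 0.3
particles = rng.standard_normal(200)   # prior N(0, 1)
log_weights = np.zeros(200)
for _ in range(30):
    y = true_theta + noise * rng.standard_normal()
    loglik = lambda th: -0.5 * ((y - th) / noise) ** 2
    particles, log_weights = smc_step(particles, log_weights, loglik, rng)

w = np.exp(log_weights - log_weights.max())
w /= w.sum()
posterior_mean = float(np.sum(w * particles))
```

After 30 observations the weighted particle mean concentrates near the true latent value, mirroring how the data are consumed sequentially in an SMC training scheme.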

Cite this Paper


BibTeX
@InProceedings{pmlr-v238-zhao24b,
  title     = {On {F}eynman-{K}ac training of partial {B}ayesian neural networks},
  author    = {Zhao, Zheng and Mair, Sebastian and Sch\"{o}n, Thomas B. and Sj\"{o}lund, Jens},
  booktitle = {Proceedings of The 27th International Conference on Artificial Intelligence and Statistics},
  pages     = {3223--3231},
  year      = {2024},
  editor    = {Dasgupta, Sanjoy and Mandt, Stephan and Li, Yingzhen},
  volume    = {238},
  series    = {Proceedings of Machine Learning Research},
  month     = {02--04 May},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v238/zhao24b/zhao24b.pdf},
  url       = {https://proceedings.mlr.press/v238/zhao24b.html},
  abstract  = {Recently, partial Bayesian neural networks (pBNNs), which only consider a subset of the parameters to be stochastic, were shown to perform competitively with full Bayesian neural networks. However, pBNNs are often multi-modal in the latent variable space and thus challenging to approximate with parametric models. To address this problem, we propose an efficient sampling-based training strategy, wherein the training of a pBNN is formulated as simulating a Feynman-Kac model. We then describe variations of sequential Monte Carlo samplers that allow us to simultaneously estimate the parameters and the latent posterior distribution of this model at a tractable computational cost. Using various synthetic and real-world datasets we show that our proposed training scheme outperforms the state of the art in terms of predictive performance.}
}
Endnote
%0 Conference Paper
%T On Feynman-Kac training of partial Bayesian neural networks
%A Zheng Zhao
%A Sebastian Mair
%A Thomas B. Schön
%A Jens Sjölund
%B Proceedings of The 27th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2024
%E Sanjoy Dasgupta
%E Stephan Mandt
%E Yingzhen Li
%F pmlr-v238-zhao24b
%I PMLR
%P 3223--3231
%U https://proceedings.mlr.press/v238/zhao24b.html
%V 238
%X Recently, partial Bayesian neural networks (pBNNs), which only consider a subset of the parameters to be stochastic, were shown to perform competitively with full Bayesian neural networks. However, pBNNs are often multi-modal in the latent variable space and thus challenging to approximate with parametric models. To address this problem, we propose an efficient sampling-based training strategy, wherein the training of a pBNN is formulated as simulating a Feynman-Kac model. We then describe variations of sequential Monte Carlo samplers that allow us to simultaneously estimate the parameters and the latent posterior distribution of this model at a tractable computational cost. Using various synthetic and real-world datasets we show that our proposed training scheme outperforms the state of the art in terms of predictive performance.
APA
Zhao, Z., Mair, S., Schön, T. B., & Sjölund, J. (2024). On Feynman-Kac training of partial Bayesian neural networks. Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 238:3223-3231. Available from https://proceedings.mlr.press/v238/zhao24b.html.