BayesAdapter: Being Bayesian, Inexpensively and Reliably, via Bayesian Fine-tuning
Proceedings of The 14th Asian Conference on Machine Learning, PMLR 189:280-295, 2023.
Abstract
Despite their theoretical appeal, Bayesian
neural networks (BNNs) lag behind in real-world
adoption, mainly due to persistent concerns about their
scalability, accessibility, and reliability. In this
work, we develop the BayesAdapter framework to
relieve these concerns. In particular, we propose to
adapt pre-trained deterministic NNs to be
variational BNNs via cost-effective Bayesian
fine-tuning. Technically, we develop a modularized
implementation for learning variational BNNs,
and refurbish the generally applicable exemplar
reparameterization trick through exemplar
parallelization to efficiently reduce the gradient
variance in stochastic variational inference. Based
on this lightweight Bayesian learning paradigm,
we conduct extensive experiments on a variety of
benchmarks, and show that our method can
consistently induce posteriors of higher quality
than competitive baselines while significantly
reducing training overhead.
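
To make the two ideas in the abstract concrete, here is a minimal sketch, not the authors' released code, assuming PyTorch and a mean-field Gaussian posterior: (1) Bayesian fine-tuning, a variational layer whose posterior mean is initialized from a pre-trained deterministic layer, and (2) exemplar reparameterization, drawing one independent weight sample per exemplar in the batch, in parallel, to reduce the gradient variance of stochastic variational inference. The class name ExemplarBayesLinear, the init_log_sigma value, and the prior/dataset-size constants below are illustrative assumptions; the paper's actual implementation of exemplar parallelization is more memory-efficient than this naive per-exemplar sampling.

import math
import torch
import torch.nn as nn
import torch.nn.functional as F

class ExemplarBayesLinear(nn.Module):
    """Mean-field Gaussian variational linear layer (illustrative sketch)."""

    def __init__(self, pretrained: nn.Linear, init_log_sigma: float = -6.0):
        super().__init__()
        # (1) Bayesian fine-tuning: initialize the variational mean from
        # the pre-trained deterministic weights, with a small initial std.
        self.w_mu = nn.Parameter(pretrained.weight.detach().clone())
        self.w_log_sigma = nn.Parameter(
            torch.full_like(pretrained.weight, init_log_sigma))
        self.bias = nn.Parameter(pretrained.bias.detach().clone())

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # (2) Exemplar reparameterization: one weight sample per exemplar,
        # drawn in parallel; eps has shape (batch, out_features, in_features).
        sigma = self.w_log_sigma.exp()
        eps = torch.randn(x.size(0), *self.w_mu.shape, device=x.device)
        w = self.w_mu + sigma * eps                      # (B, out, in)
        # Batched matmul so each exemplar sees its own weight sample.
        return torch.einsum('bi,boi->bo', x, w) + self.bias

    def kl(self, prior_sigma: float = 1.0) -> torch.Tensor:
        # Closed-form KL(q || p) between the factorized Gaussian posterior
        # and a zero-mean isotropic Gaussian prior.
        var = (2 * self.w_log_sigma).exp()
        return 0.5 * ((var + self.w_mu ** 2) / prior_sigma ** 2
                      - 1.0
                      - 2 * self.w_log_sigma
                      + 2 * math.log(prior_sigma)).sum()

A fine-tuning step would then optimize the usual evidence lower bound, i.e. the data likelihood plus the KL term scaled by the (assumed) training-set size:

det = nn.Linear(128, 10)                 # stand-in for a pre-trained head
bayes = ExemplarBayesLinear(det)
opt = torch.optim.Adam(bayes.parameters(), lr=1e-3)
x, y = torch.randn(32, 128), torch.randint(0, 10, (32,))
loss = F.cross_entropy(bayes(x), y) + bayes.kl() / 50_000  # N = dataset size
loss.backward()
opt.step()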