Incorporating functional summary information in Bayesian neural networks using a Dirichlet process likelihood approach

Vishnu Raj, Tianyu Cui, Markus Heinonen, Pekka Marttinen
Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, PMLR 206:6741-6763, 2023.

Abstract

Bayesian neural networks (BNNs) can account for both aleatoric and epistemic uncertainty. However, in BNNs the priors are often specified over the weights, which rarely reflects true prior knowledge in large and complex neural network architectures. We present a simple approach to incorporate prior knowledge in BNNs based on external summary information about the predicted classification probabilities for a given dataset. The available summary information is incorporated as augmented data and modeled with a Dirichlet process, and we derive the corresponding Summary Evidence Lower BOund. The approach is founded on Bayesian principles, and all hyperparameters have a proper probabilistic interpretation. We show how the method can inform the model about task difficulty and class imbalance. Extensive experiments show that, with negligible computational overhead, our method parallels and in many cases outperforms popular alternatives in accuracy, uncertainty calibration, and robustness against corruptions with both balanced and imbalanced data.
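The abstract leaves the training objective implicit, but the core ingredient, a Dirichlet likelihood over predicted class probabilities, can be sketched concretely. The snippet below is a minimal, hypothetical illustration, not the paper's Summary Evidence Lower BOund: it evaluates a Dirichlet log-density at a softmax output, the kind of augmented-data term that can encode prior beliefs about class balance and task difficulty. The concentration vector alpha and the weight lambda_ are assumptions chosen for illustration.

import numpy as np
from scipy.special import gammaln

def dirichlet_log_pdf(p, alpha):
    # log Dirichlet(alpha) density at probability vector p:
    # log Gamma(sum(alpha)) - sum(log Gamma(alpha_i)) + sum((alpha_i - 1) * log p_i)
    return (gammaln(alpha.sum()) - gammaln(alpha).sum()
            + ((alpha - 1.0) * np.log(p)).sum())

# Hypothetical summary prior: 3 classes believed to be roughly balanced,
# with moderate confidence (larger alpha = stronger belief).
alpha = np.array([5.0, 5.0, 5.0])

# A predicted class-probability vector, e.g. a BNN softmax output.
p = np.array([0.6, 0.3, 0.1])

# An augmented-data term that could be added to a standard ELBO;
# lambda_ (assumed) trades off the summary term against the data fit.
lambda_ = 0.1
summary_term = lambda_ * dirichlet_log_pdf(p, alpha)
print(f"Dirichlet log-density: {dirichlet_log_pdf(p, alpha):.3f}")
print(f"Weighted summary term: {summary_term:.3f}")

A larger alpha concentrates the prior near its mean (here, uniform class probabilities), while a small alpha tolerates highly skewed predictions; this is one plausible way summary information about class imbalance could enter the objective.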

Cite this Paper


BibTeX
@InProceedings{pmlr-v206-raj23a,
  title     = {Incorporating functional summary information in Bayesian neural networks using a Dirichlet process likelihood approach},
  author    = {Raj, Vishnu and Cui, Tianyu and Heinonen, Markus and Marttinen, Pekka},
  booktitle = {Proceedings of The 26th International Conference on Artificial Intelligence and Statistics},
  pages     = {6741--6763},
  year      = {2023},
  editor    = {Ruiz, Francisco and Dy, Jennifer and van de Meent, Jan-Willem},
  volume    = {206},
  series    = {Proceedings of Machine Learning Research},
  month     = {25--27 Apr},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v206/raj23a/raj23a.pdf},
  url       = {https://proceedings.mlr.press/v206/raj23a.html},
  abstract  = {Bayesian neural networks (BNNs) can account for both aleatoric and epistemic uncertainty. However, in BNNs the priors are often specified over the weights which rarely reflects true prior knowledge in large and complex neural network architectures. We present a simple approach to incorporate prior knowledge in BNNs based on external summary information about the predicted classification probabilities for a given dataset. The available summary information is incorporated as augmented data and modeled with a Dirichlet process, and we derive the corresponding Summary Evidence Lower BOund. The approach is founded on Bayesian principles, and all hyperparameters have a proper probabilistic interpretation. We show how the method can inform the model about task difficulty and class imbalance. Extensive experiments show that, with negligible computational overhead, our method parallels and in many cases outperforms popular alternatives in accuracy, uncertainty calibration, and robustness against corruptions with both balanced and imbalanced data.}
}
Endnote
%0 Conference Paper
%T Incorporating functional summary information in Bayesian neural networks using a Dirichlet process likelihood approach
%A Vishnu Raj
%A Tianyu Cui
%A Markus Heinonen
%A Pekka Marttinen
%B Proceedings of The 26th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2023
%E Francisco Ruiz
%E Jennifer Dy
%E Jan-Willem van de Meent
%F pmlr-v206-raj23a
%I PMLR
%P 6741--6763
%U https://proceedings.mlr.press/v206/raj23a.html
%V 206
%X Bayesian neural networks (BNNs) can account for both aleatoric and epistemic uncertainty. However, in BNNs the priors are often specified over the weights which rarely reflects true prior knowledge in large and complex neural network architectures. We present a simple approach to incorporate prior knowledge in BNNs based on external summary information about the predicted classification probabilities for a given dataset. The available summary information is incorporated as augmented data and modeled with a Dirichlet process, and we derive the corresponding Summary Evidence Lower BOund. The approach is founded on Bayesian principles, and all hyperparameters have a proper probabilistic interpretation. We show how the method can inform the model about task difficulty and class imbalance. Extensive experiments show that, with negligible computational overhead, our method parallels and in many cases outperforms popular alternatives in accuracy, uncertainty calibration, and robustness against corruptions with both balanced and imbalanced data.
APA
Raj, V., Cui, T., Heinonen, M. & Marttinen, P. (2023). Incorporating functional summary information in Bayesian neural networks using a Dirichlet process likelihood approach. Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 206:6741-6763. Available from https://proceedings.mlr.press/v206/raj23a.html.
