Position: Bayesian Deep Learning is Needed in the Age of Large-Scale AI

Theodore Papamarkou, Maria Skoularidou, Konstantina Palla, Laurence Aitchison, Julyan Arbel, David Dunson, Maurizio Filippone, Vincent Fortuin, Philipp Hennig, José Miguel Hernández-Lobato, Aliaksandr Hubin, Alexander Immer, Theofanis Karaletsos, Mohammad Emtiyaz Khan, Agustinus Kristiadi, Yingzhen Li, Stephan Mandt, Christopher Nemeth, Michael A Osborne, Tim G. J. Rudner, David Rügamer, Yee Whye Teh, Max Welling, Andrew Gordon Wilson, Ruqi Zhang
Proceedings of the 41st International Conference on Machine Learning, PMLR 235:39556-39586, 2024.

Abstract

In the current landscape of deep learning research, there is a predominant emphasis on achieving high predictive accuracy in supervised tasks involving large image and language datasets. However, a broader perspective reveals a multitude of overlooked metrics, tasks, and data types, such as uncertainty, active and continual learning, and scientific data, that demand attention. Bayesian deep learning (BDL) constitutes a promising avenue, offering advantages across these diverse settings. This paper posits that BDL can elevate the capabilities of deep learning. It revisits the strengths of BDL, acknowledges existing challenges, and highlights some exciting research avenues aimed at addressing these obstacles. Looking ahead, the discussion focuses on possible ways to combine large-scale foundation models with BDL to unlock their full potential.

Cite this Paper


BibTeX
@InProceedings{pmlr-v235-papamarkou24b,
  title     = {Position: {B}ayesian Deep Learning is Needed in the Age of Large-Scale {AI}},
  author    = {Papamarkou, Theodore and Skoularidou, Maria and Palla, Konstantina and Aitchison, Laurence and Arbel, Julyan and Dunson, David and Filippone, Maurizio and Fortuin, Vincent and Hennig, Philipp and Hern\'{a}ndez-Lobato, Jos\'{e} Miguel and Hubin, Aliaksandr and Immer, Alexander and Karaletsos, Theofanis and Khan, Mohammad Emtiyaz and Kristiadi, Agustinus and Li, Yingzhen and Mandt, Stephan and Nemeth, Christopher and Osborne, Michael A and Rudner, Tim G. J. and R\"{u}gamer, David and Teh, Yee Whye and Welling, Max and Wilson, Andrew Gordon and Zhang, Ruqi},
  booktitle = {Proceedings of the 41st International Conference on Machine Learning},
  pages     = {39556--39586},
  year      = {2024},
  editor    = {Salakhutdinov, Ruslan and Kolter, Zico and Heller, Katherine and Weller, Adrian and Oliver, Nuria and Scarlett, Jonathan and Berkenkamp, Felix},
  volume    = {235},
  series    = {Proceedings of Machine Learning Research},
  month     = {21--27 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v235/main/assets/papamarkou24b/papamarkou24b.pdf},
  url       = {https://proceedings.mlr.press/v235/papamarkou24b.html},
  abstract  = {In the current landscape of deep learning research, there is a predominant emphasis on achieving high predictive accuracy in supervised tasks involving large image and language datasets. However, a broader perspective reveals a multitude of overlooked metrics, tasks, and data types, such as uncertainty, active and continual learning, and scientific data, that demand attention. Bayesian deep learning (BDL) constitutes a promising avenue, offering advantages across these diverse settings. This paper posits that BDL can elevate the capabilities of deep learning. It revisits the strengths of BDL, acknowledges existing challenges, and highlights some exciting research avenues aimed at addressing these obstacles. Looking ahead, the discussion focuses on possible ways to combine large-scale foundation models with BDL to unlock their full potential.}
}
Endnote
%0 Conference Paper
%T Position: Bayesian Deep Learning is Needed in the Age of Large-Scale AI
%A Theodore Papamarkou
%A Maria Skoularidou
%A Konstantina Palla
%A Laurence Aitchison
%A Julyan Arbel
%A David Dunson
%A Maurizio Filippone
%A Vincent Fortuin
%A Philipp Hennig
%A José Miguel Hernández-Lobato
%A Aliaksandr Hubin
%A Alexander Immer
%A Theofanis Karaletsos
%A Mohammad Emtiyaz Khan
%A Agustinus Kristiadi
%A Yingzhen Li
%A Stephan Mandt
%A Christopher Nemeth
%A Michael A Osborne
%A Tim G. J. Rudner
%A David Rügamer
%A Yee Whye Teh
%A Max Welling
%A Andrew Gordon Wilson
%A Ruqi Zhang
%B Proceedings of the 41st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2024
%E Ruslan Salakhutdinov
%E Zico Kolter
%E Katherine Heller
%E Adrian Weller
%E Nuria Oliver
%E Jonathan Scarlett
%E Felix Berkenkamp
%F pmlr-v235-papamarkou24b
%I PMLR
%P 39556--39586
%U https://proceedings.mlr.press/v235/papamarkou24b.html
%V 235
%X In the current landscape of deep learning research, there is a predominant emphasis on achieving high predictive accuracy in supervised tasks involving large image and language datasets. However, a broader perspective reveals a multitude of overlooked metrics, tasks, and data types, such as uncertainty, active and continual learning, and scientific data, that demand attention. Bayesian deep learning (BDL) constitutes a promising avenue, offering advantages across these diverse settings. This paper posits that BDL can elevate the capabilities of deep learning. It revisits the strengths of BDL, acknowledges existing challenges, and highlights some exciting research avenues aimed at addressing these obstacles. Looking ahead, the discussion focuses on possible ways to combine large-scale foundation models with BDL to unlock their full potential.
APA
Papamarkou, T., Skoularidou, M., Palla, K., Aitchison, L., Arbel, J., Dunson, D., Filippone, M., Fortuin, V., Hennig, P., Hernández-Lobato, J. M., Hubin, A., Immer, A., Karaletsos, T., Khan, M. E., Kristiadi, A., Li, Y., Mandt, S., Nemeth, C., Osborne, M. A., Rudner, T. G. J., Rügamer, D., Teh, Y. W., Welling, M., Wilson, A. G., & Zhang, R. (2024). Position: Bayesian Deep Learning is Needed in the Age of Large-Scale AI. Proceedings of the 41st International Conference on Machine Learning, in Proceedings of Machine Learning Research 235:39556-39586. Available from https://proceedings.mlr.press/v235/papamarkou24b.html.