Retrospective Uncertainties for Deep Models using Vine Copulas

Natasa Tagasovska, Firat Ozdemir, Axel Brando
Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, PMLR 206:7528-7539, 2023.

Abstract

Despite the major progress of deep models as learning machines, uncertainty estimation remains a major challenge. Existing solutions rely on modified loss functions or architectural changes. We propose to compensate for the lack of built-in uncertainty estimates by supplementing any network, retrospectively, with a subsequent vine copula model, in an overall compound we call Vine-Copula Neural Network (VCNN). Through synthetic and real-data experiments, we show that VCNNs could be task (regression/classification) and architecture (recurrent, fully connected) agnostic while providing reliable and better-calibrated uncertainty estimates, comparable to state-of-the-art built-in uncertainty solutions.

Cite this Paper


BibTeX
@InProceedings{pmlr-v206-tagasovska23a,
  title     = {Retrospective Uncertainties for Deep Models using Vine Copulas},
  author    = {Tagasovska, Natasa and Ozdemir, Firat and Brando, Axel},
  booktitle = {Proceedings of The 26th International Conference on Artificial Intelligence and Statistics},
  pages     = {7528--7539},
  year      = {2023},
  editor    = {Ruiz, Francisco and Dy, Jennifer and van de Meent, Jan-Willem},
  volume    = {206},
  series    = {Proceedings of Machine Learning Research},
  month     = {25--27 Apr},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v206/tagasovska23a/tagasovska23a.pdf},
  url       = {https://proceedings.mlr.press/v206/tagasovska23a.html},
  abstract  = {Despite the major progress of deep models as learning machines, uncertainty estimation remains a major challenge. Existing solutions rely on modified loss functions or architectural changes. We propose to compensate for the lack of built-in uncertainty estimates by supplementing any network, retrospectively, with a subsequent vine copula model, in an overall compound we call Vine-Copula Neural Network (VCNN). Through synthetic and real-data experiments, we show that VCNNs could be task (regression/classification) and architecture (recurrent, fully connected) agnostic while providing reliable and better-calibrated uncertainty estimates, comparable to state-of-the-art built-in uncertainty solutions.}
}
Endnote
%0 Conference Paper
%T Retrospective Uncertainties for Deep Models using Vine Copulas
%A Natasa Tagasovska
%A Firat Ozdemir
%A Axel Brando
%B Proceedings of The 26th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2023
%E Francisco Ruiz
%E Jennifer Dy
%E Jan-Willem van de Meent
%F pmlr-v206-tagasovska23a
%I PMLR
%P 7528--7539
%U https://proceedings.mlr.press/v206/tagasovska23a.html
%V 206
%X Despite the major progress of deep models as learning machines, uncertainty estimation remains a major challenge. Existing solutions rely on modified loss functions or architectural changes. We propose to compensate for the lack of built-in uncertainty estimates by supplementing any network, retrospectively, with a subsequent vine copula model, in an overall compound we call Vine-Copula Neural Network (VCNN). Through synthetic and real-data experiments, we show that VCNNs could be task (regression/classification) and architecture (recurrent, fully connected) agnostic while providing reliable and better-calibrated uncertainty estimates, comparable to state-of-the-art built-in uncertainty solutions.
APA
Tagasovska, N., Ozdemir, F. & Brando, A. (2023). Retrospective Uncertainties for Deep Models using Vine Copulas. Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 206:7528-7539. Available from https://proceedings.mlr.press/v206/tagasovska23a.html.