Accurate Uncertainties for Deep Learning Using Calibrated Regression

Volodymyr Kuleshov, Nathan Fenner, Stefano Ermon
Proceedings of the 35th International Conference on Machine Learning, PMLR 80:2796-2804, 2018.

Abstract

Methods for reasoning under uncertainty are a key building block of accurate and reliable machine learning systems. Bayesian methods provide a general framework to quantify uncertainty. However, because of model misspecification and the use of approximate inference, Bayesian uncertainty estimates are often inaccurate; for example, a 90% credible interval may not contain the true outcome 90% of the time. Here, we propose a simple procedure for calibrating any regression algorithm; when applied to Bayesian and probabilistic models, it is guaranteed to produce calibrated uncertainty estimates given enough data. Our procedure is inspired by Platt scaling and extends previous work on classification. We evaluate this approach on Bayesian linear regression, feedforward, and recurrent neural networks, and find that it consistently outputs well-calibrated credible intervals while improving performance on time series forecasting and model-based reinforcement learning tasks.
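The recalibration idea the abstract describes (learn a map from the base model's predicted CDF values to their empirical frequencies on held-out data, analogously to Platt scaling) can be sketched as below. This is an illustrative sketch, not the paper's implementation: the paper fits the map with isotonic regression, whereas this toy version uses the raw empirical CDF of the calibration set, and all names and the toy data are hypothetical.

```python
import numpy as np
from math import erf

def recalibrate(pred_cdf_cal):
    """Fit a recalibration map R on a held-out calibration set.

    pred_cdf_cal holds the base model's predicted CDF evaluated at each
    observed outcome, i.e. H(x_i)(y_i). R(p) returns the empirical
    frequency with which those values fall at or below p, so that
    R(H(x)(y)) is approximately uniform when evaluated on fresh data.
    """
    sorted_p = np.sort(np.asarray(pred_cdf_cal))
    n = len(sorted_p)

    def R(p):
        # Empirical CDF of the calibration-set predicted-CDF values.
        return np.searchsorted(sorted_p, p, side="right") / n

    return R

# Toy demo: outcomes have noise std 2, but the model is overconfident
# and predicts a Gaussian with std 1, so its credible intervals are
# too narrow before recalibration.
gauss_cdf = np.vectorize(lambda z: 0.5 * (1.0 + erf(z / np.sqrt(2.0))))
rng = np.random.default_rng(0)
y = rng.normal(0.0, 2.0, size=2000)
p = gauss_cdf(y / 1.0)  # predicted CDF values under the overconfident model

R = recalibrate(p[:1000])                     # fit on the calibration half
coverage_before = np.mean(p[1000:] <= 0.9)    # well below the nominal 0.9
coverage_after = np.mean(R(p[1000:]) <= 0.9)  # close to 0.9 after recalibration
```

In the demo, the model's nominal 90th percentile covers far fewer than 90% of outcomes before recalibration, while composing its CDF with R restores roughly the advertised coverage, which is the calibration property the abstract refers to.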

Cite this Paper


BibTeX
@InProceedings{pmlr-v80-kuleshov18a,
  title = {Accurate Uncertainties for Deep Learning Using Calibrated Regression},
  author = {Kuleshov, Volodymyr and Fenner, Nathan and Ermon, Stefano},
  booktitle = {Proceedings of the 35th International Conference on Machine Learning},
  pages = {2796--2804},
  year = {2018},
  editor = {Dy, Jennifer and Krause, Andreas},
  volume = {80},
  series = {Proceedings of Machine Learning Research},
  month = {10--15 Jul},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v80/kuleshov18a/kuleshov18a.pdf},
  url = {http://proceedings.mlr.press/v80/kuleshov18a.html},
  abstract = {Methods for reasoning under uncertainty are a key building block of accurate and reliable machine learning systems. Bayesian methods provide a general framework to quantify uncertainty. However, because of model misspecification and the use of approximate inference, Bayesian uncertainty estimates are often inaccurate; for example, a 90% credible interval may not contain the true outcome 90% of the time. Here, we propose a simple procedure for calibrating any regression algorithm; when applied to Bayesian and probabilistic models, it is guaranteed to produce calibrated uncertainty estimates given enough data. Our procedure is inspired by Platt scaling and extends previous work on classification. We evaluate this approach on Bayesian linear regression, feedforward, and recurrent neural networks, and find that it consistently outputs well-calibrated credible intervals while improving performance on time series forecasting and model-based reinforcement learning tasks.}
}
Endnote
%0 Conference Paper
%T Accurate Uncertainties for Deep Learning Using Calibrated Regression
%A Volodymyr Kuleshov
%A Nathan Fenner
%A Stefano Ermon
%B Proceedings of the 35th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2018
%E Jennifer Dy
%E Andreas Krause
%F pmlr-v80-kuleshov18a
%I PMLR
%P 2796--2804
%U http://proceedings.mlr.press/v80/kuleshov18a.html
%V 80
%X Methods for reasoning under uncertainty are a key building block of accurate and reliable machine learning systems. Bayesian methods provide a general framework to quantify uncertainty. However, because of model misspecification and the use of approximate inference, Bayesian uncertainty estimates are often inaccurate; for example, a 90% credible interval may not contain the true outcome 90% of the time. Here, we propose a simple procedure for calibrating any regression algorithm; when applied to Bayesian and probabilistic models, it is guaranteed to produce calibrated uncertainty estimates given enough data. Our procedure is inspired by Platt scaling and extends previous work on classification. We evaluate this approach on Bayesian linear regression, feedforward, and recurrent neural networks, and find that it consistently outputs well-calibrated credible intervals while improving performance on time series forecasting and model-based reinforcement learning tasks.
APA
Kuleshov, V., Fenner, N. & Ermon, S. (2018). Accurate Uncertainties for Deep Learning Using Calibrated Regression. Proceedings of the 35th International Conference on Machine Learning, in Proceedings of Machine Learning Research 80:2796-2804. Available from http://proceedings.mlr.press/v80/kuleshov18a.html.