Distribution calibration for regression

Hao Song, Tom Diethe, Meelis Kull, Peter Flach
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:5897-5906, 2019.

Abstract

We are concerned with obtaining well-calibrated output distributions from regression models. Such distributions allow us to quantify the uncertainty that the model has regarding the predicted target value. We introduce the novel concept of distribution calibration, and demonstrate its advantages over the existing definition of quantile calibration. We further propose a post-hoc approach to improving the predictions from previously trained regression models, using multi-output Gaussian Processes with a novel Beta link function. The proposed method is experimentally verified on a set of common regression models and shows improvements for both distribution-level and quantile-level calibration.
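The core post-hoc idea can be illustrated with a small sketch: map the raw predictive CDF through a monotone beta-calibration-style link so the recalibrated output is still a valid distribution. The parameters `a`, `b`, `c` below are illustrative constants, not values from the paper (there they are predicted per instance by a multi-output Gaussian Process), and the Gaussian base model is likewise an assumption for the example.

```python
import numpy as np
from scipy.stats import norm

def beta_link(p, a, b, c):
    """Beta-calibration-style link applied to CDF values in (0, 1).

    With a, b > 0 the map is strictly increasing, so composing it with a
    predictive CDF yields another valid CDF. a = b = 1, c = 0 gives the
    identity map (no recalibration).
    """
    p = np.clip(p, 1e-12, 1 - 1e-12)          # avoid log(0) at the tails
    z = c + a * np.log(p) - b * np.log1p(-p)  # generalised log-odds of p
    return 1.0 / (1.0 + np.exp(-z))           # squash back to (0, 1)

# Example: recalibrate a Gaussian predictive distribution post hoc.
mu, sigma = 0.0, 2.0                      # raw model's predictive mean/std
y = np.linspace(-5.0, 5.0, 101)
raw_cdf = norm.cdf(y, loc=mu, scale=sigma)
cal_cdf = beta_link(raw_cdf, a=1.5, b=1.5, c=0.0)  # sharpens both tails
```

Because the link acts on CDF values, the same recalibration map can be wrapped around any previously trained regression model that outputs a predictive distribution, without retraining it.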

Cite this Paper


BibTeX
@InProceedings{pmlr-v97-song19a,
  title     = {Distribution calibration for regression},
  author    = {Song, Hao and Diethe, Tom and Kull, Meelis and Flach, Peter},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning},
  pages     = {5897--5906},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Salakhutdinov, Ruslan},
  volume    = {97},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--15 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v97/song19a/song19a.pdf},
  url       = {https://proceedings.mlr.press/v97/song19a.html},
  abstract  = {We are concerned with obtaining well-calibrated output distributions from regression models. Such distributions allow us to quantify the uncertainty that the model has regarding the predicted target value. We introduce the novel concept of distribution calibration, and demonstrate its advantages over the existing definition of quantile calibration. We further propose a post-hoc approach to improving the predictions from previously trained regression models, using multi-output Gaussian Processes with a novel Beta link function. The proposed method is experimentally verified on a set of common regression models and shows improvements for both distribution-level and quantile-level calibration.}
}
Endnote
%0 Conference Paper
%T Distribution calibration for regression
%A Hao Song
%A Tom Diethe
%A Meelis Kull
%A Peter Flach
%B Proceedings of the 36th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Ruslan Salakhutdinov
%F pmlr-v97-song19a
%I PMLR
%P 5897--5906
%U https://proceedings.mlr.press/v97/song19a.html
%V 97
%X We are concerned with obtaining well-calibrated output distributions from regression models. Such distributions allow us to quantify the uncertainty that the model has regarding the predicted target value. We introduce the novel concept of distribution calibration, and demonstrate its advantages over the existing definition of quantile calibration. We further propose a post-hoc approach to improving the predictions from previously trained regression models, using multi-output Gaussian Processes with a novel Beta link function. The proposed method is experimentally verified on a set of common regression models and shows improvements for both distribution-level and quantile-level calibration.
APA
Song, H., Diethe, T., Kull, M. & Flach, P. (2019). Distribution calibration for regression. Proceedings of the 36th International Conference on Machine Learning, in Proceedings of Machine Learning Research 97:5897-5906. Available from https://proceedings.mlr.press/v97/song19a.html.
