Accuracy-Preserving Calibration via Statistical Modeling on Probability Simplex

Yasushi Esaki, Akihiro Nakamura, Keisuke Kawano, Ryoko Tokuhisa, Takuro Kutsuna
Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, PMLR 238:1666-1674, 2024.

Abstract

Classification models based on deep neural networks (DNNs) must be calibrated to measure the reliability of predictions. Some recent calibration methods have employed a probabilistic model on the probability simplex. However, these calibration methods cannot preserve the accuracy of pre-trained models, even those with a high classification accuracy. We propose an accuracy-preserving calibration method using the Concrete distribution as the probabilistic model on the probability simplex. We theoretically prove that a DNN model trained on cross-entropy loss has optimality as the parameter of the Concrete distribution. We also propose an efficient method that synthetically generates samples for training probabilistic models on the probability simplex. We demonstrate that the proposed method can outperform previous methods in accuracy-preserving calibration tasks using benchmarks.
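The abstract's key object, the Concrete distribution on the probability simplex, can be sampled with the standard Gumbel-softmax construction: add independent Gumbel noise to the class logits and apply a temperature-scaled softmax. The sketch below is a generic illustration of that construction, not the authors' calibration method; the function name, the example logits, and the temperature value are illustrative choices.

```python
import numpy as np

def sample_concrete(logits, temperature, rng):
    """Draw one sample from a Concrete (Gumbel-softmax) distribution
    on the probability simplex, parameterized by class logits."""
    # Gumbel(0, 1) noise via inverse transform of uniforms.
    u = rng.uniform(low=1e-12, high=1.0, size=logits.shape)
    gumbel = -np.log(-np.log(u))
    # Tempered softmax of perturbed logits; lower temperature
    # pushes samples toward the simplex vertices (one-hot labels).
    y = (logits + gumbel) / temperature
    y = y - y.max()            # subtract max for numerical stability
    expy = np.exp(y)
    return expy / expy.sum()   # a point on the probability simplex

rng = np.random.default_rng(0)
p = sample_concrete(np.array([2.0, 0.5, -1.0]), temperature=0.5, rng=rng)
```

In this framing, a DNN's logit output plays the role of the distribution's parameter, which is consistent with the abstract's claim that a cross-entropy-trained model is an optimal parameter choice.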

Cite this Paper


BibTeX
@InProceedings{pmlr-v238-esaki24a,
  title     = {Accuracy-Preserving Calibration via Statistical Modeling on Probability Simplex},
  author    = {Esaki, Yasushi and Nakamura, Akihiro and Kawano, Keisuke and Tokuhisa, Ryoko and Kutsuna, Takuro},
  booktitle = {Proceedings of The 27th International Conference on Artificial Intelligence and Statistics},
  pages     = {1666--1674},
  year      = {2024},
  editor    = {Dasgupta, Sanjoy and Mandt, Stephan and Li, Yingzhen},
  volume    = {238},
  series    = {Proceedings of Machine Learning Research},
  month     = {02--04 May},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v238/esaki24a/esaki24a.pdf},
  url       = {https://proceedings.mlr.press/v238/esaki24a.html},
  abstract  = {Classification models based on deep neural networks (DNNs) must be calibrated to measure the reliability of predictions. Some recent calibration methods have employed a probabilistic model on the probability simplex. However, these calibration methods cannot preserve the accuracy of pre-trained models, even those with a high classification accuracy. We propose an accuracy-preserving calibration method using the Concrete distribution as the probabilistic model on the probability simplex. We theoretically prove that a DNN model trained on cross-entropy loss has optimality as the parameter of the Concrete distribution. We also propose an efficient method that synthetically generates samples for training probabilistic models on the probability simplex. We demonstrate that the proposed method can outperform previous methods in accuracy-preserving calibration tasks using benchmarks.}
}
Endnote
%0 Conference Paper
%T Accuracy-Preserving Calibration via Statistical Modeling on Probability Simplex
%A Yasushi Esaki
%A Akihiro Nakamura
%A Keisuke Kawano
%A Ryoko Tokuhisa
%A Takuro Kutsuna
%B Proceedings of The 27th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2024
%E Sanjoy Dasgupta
%E Stephan Mandt
%E Yingzhen Li
%F pmlr-v238-esaki24a
%I PMLR
%P 1666--1674
%U https://proceedings.mlr.press/v238/esaki24a.html
%V 238
%X Classification models based on deep neural networks (DNNs) must be calibrated to measure the reliability of predictions. Some recent calibration methods have employed a probabilistic model on the probability simplex. However, these calibration methods cannot preserve the accuracy of pre-trained models, even those with a high classification accuracy. We propose an accuracy-preserving calibration method using the Concrete distribution as the probabilistic model on the probability simplex. We theoretically prove that a DNN model trained on cross-entropy loss has optimality as the parameter of the Concrete distribution. We also propose an efficient method that synthetically generates samples for training probabilistic models on the probability simplex. We demonstrate that the proposed method can outperform previous methods in accuracy-preserving calibration tasks using benchmarks.
APA
Esaki, Y., Nakamura, A., Kawano, K., Tokuhisa, R., & Kutsuna, T. (2024). Accuracy-Preserving Calibration via Statistical Modeling on Probability Simplex. Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 238:1666-1674. Available from https://proceedings.mlr.press/v238/esaki24a.html.