Conformal Inference is (almost) Free for Neural Networks Trained with Early Stopping

Ziyi Liang, Yanfei Zhou, Matteo Sesia
Proceedings of the 40th International Conference on Machine Learning, PMLR 202:20810-20851, 2023.

Abstract

Early stopping based on hold-out data is a popular regularization technique designed to mitigate overfitting and increase the predictive accuracy of neural networks. Models trained with early stopping often provide relatively accurate predictions, but they generally still lack precise statistical guarantees unless they are further calibrated using independent hold-out data. This paper addresses this limitation with conformalized early stopping: a novel method that combines early stopping with conformal calibration while efficiently recycling the same hold-out data. This leads to models that are both accurate and able to provide exact predictive inferences without multiple data splits or overly conservative adjustments. Practical implementations are developed for different learning tasks (outlier detection, multi-class classification, regression), and their competitive performance is demonstrated on real data.
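For context, the calibration step mentioned in the abstract is conventionally implemented with split conformal prediction, which reserves a hold-out set solely for computing a quantile of nonconformity scores. The sketch below illustrates that standard baseline for regression. It is a generic illustration under assumed inputs (the function name, the residuals, and the test prediction are all hypothetical), not the paper's conformalized early stopping procedure, which avoids this extra data split.

import numpy as np

def split_conformal_interval(residuals_cal, y_pred_test, alpha=0.1):
    # Standard split conformal interval for regression (generic sketch,
    # not the paper's method). residuals_cal holds absolute residuals
    # |y - f(x)| on a calibration set disjoint from the data used to
    # train and early-stop f. alpha is the target miscoverage level
    # (alpha = 0.1 targets 90% coverage).
    n = len(residuals_cal)
    # Finite-sample calibration rank: the ceil((n + 1) * (1 - alpha))-th
    # smallest residual (capped at n here for simplicity).
    k = int(np.ceil((n + 1) * (1 - alpha)))
    q = np.sort(residuals_cal)[min(k, n) - 1]
    return y_pred_test - q, y_pred_test + q

# Hypothetical usage: 200 hold-out residuals, one test-point prediction.
rng = np.random.default_rng(0)
residuals = np.abs(rng.normal(size=200))
lo, hi = split_conformal_interval(residuals, y_pred_test=3.5)
print(f"90% prediction interval: [{lo:.2f}, {hi:.2f}]")

Under exchangeability, such an interval covers the true response with probability at least 1 - alpha. The paper's contribution is to obtain this kind of guarantee while reusing the same hold-out data already consumed by early stopping, rather than splitting the data again.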

Cite this Paper

BibTeX
@InProceedings{pmlr-v202-liang23i,
  title     = {Conformal Inference is (almost) Free for Neural Networks Trained with Early Stopping},
  author    = {Liang, Ziyi and Zhou, Yanfei and Sesia, Matteo},
  booktitle = {Proceedings of the 40th International Conference on Machine Learning},
  pages     = {20810--20851},
  year      = {2023},
  editor    = {Krause, Andreas and Brunskill, Emma and Cho, Kyunghyun and Engelhardt, Barbara and Sabato, Sivan and Scarlett, Jonathan},
  volume    = {202},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--29 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v202/liang23i/liang23i.pdf},
  url       = {https://proceedings.mlr.press/v202/liang23i.html},
  abstract  = {Early stopping based on hold-out data is a popular regularization technique designed to mitigate overfitting and increase the predictive accuracy of neural networks. Models trained with early stopping often provide relatively accurate predictions, but they generally still lack precise statistical guarantees unless they are further calibrated using independent hold-out data. This paper addresses the above limitation with conformalized early stopping: a novel method that combines early stopping with conformal calibration while efficiently recycling the same hold-out data. This leads to models that are both accurate and able to provide exact predictive inferences without multiple data splits nor overly conservative adjustments. Practical implementations are developed for different learning tasks—outlier detection, multi-class classification, regression—and their competitive performance is demonstrated on real data.}
}
EndNote
%0 Conference Paper
%T Conformal Inference is (almost) Free for Neural Networks Trained with Early Stopping
%A Ziyi Liang
%A Yanfei Zhou
%A Matteo Sesia
%B Proceedings of the 40th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Andreas Krause
%E Emma Brunskill
%E Kyunghyun Cho
%E Barbara Engelhardt
%E Sivan Sabato
%E Jonathan Scarlett
%F pmlr-v202-liang23i
%I PMLR
%P 20810--20851
%U https://proceedings.mlr.press/v202/liang23i.html
%V 202
%X Early stopping based on hold-out data is a popular regularization technique designed to mitigate overfitting and increase the predictive accuracy of neural networks. Models trained with early stopping often provide relatively accurate predictions, but they generally still lack precise statistical guarantees unless they are further calibrated using independent hold-out data. This paper addresses the above limitation with conformalized early stopping: a novel method that combines early stopping with conformal calibration while efficiently recycling the same hold-out data. This leads to models that are both accurate and able to provide exact predictive inferences without multiple data splits nor overly conservative adjustments. Practical implementations are developed for different learning tasks—outlier detection, multi-class classification, regression—and their competitive performance is demonstrated on real data.
APA
Liang, Z., Zhou, Y. & Sesia, M. (2023). Conformal Inference is (almost) Free for Neural Networks Trained with Early Stopping. Proceedings of the 40th International Conference on Machine Learning, in Proceedings of Machine Learning Research 202:20810-20851. Available from https://proceedings.mlr.press/v202/liang23i.html.
