Near-Optimal Bounds for Cross-Validation via Loss Stability

Ravi Kumar, Daniel Lokshtanov, Sergei Vassilvitskii, Andrea Vattani
Proceedings of the 30th International Conference on Machine Learning, PMLR 28(1):27-35, 2013.

Abstract

Multi-fold cross-validation is an established practice to estimate the error rate of a learning algorithm. Quantifying the variance reduction gains due to cross-validation has been challenging due to the inherent correlations introduced by the folds. In this work we introduce a new and weak measure of stability called loss stability and relate the cross-validation performance to loss stability; we also establish that this relationship is near-optimal. Our work thus quantitatively improves the current best bounds on cross-validation.

Cite this Paper


BibTeX
@InProceedings{pmlr-v28-kumar13a,
  title     = {Near-Optimal Bounds for Cross-Validation via Loss Stability},
  author    = {Kumar, Ravi and Lokshtanov, Daniel and Vassilvitskii, Sergei and Vattani, Andrea},
  booktitle = {Proceedings of the 30th International Conference on Machine Learning},
  pages     = {27--35},
  year      = {2013},
  editor    = {Dasgupta, Sanjoy and McAllester, David},
  volume    = {28},
  number    = {1},
  series    = {Proceedings of Machine Learning Research},
  address   = {Atlanta, Georgia, USA},
  month     = {17--19 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v28/kumar13a.pdf},
  url       = {https://proceedings.mlr.press/v28/kumar13a.html},
  abstract  = {Multi-fold cross-validation is an established practice to estimate the error rate of a learning algorithm. Quantifying the variance reduction gains due to cross-validation has been challenging due to the inherent correlations introduced by the folds. In this work we introduce a new and weak measure of stability called \emph{loss stability} and relate the cross-validation performance to loss stability; we also establish that this relationship is near-optimal. Our work thus quantitatively improves the current best bounds on cross-validation.}
}
Endnote
%0 Conference Paper
%T Near-Optimal Bounds for Cross-Validation via Loss Stability
%A Ravi Kumar
%A Daniel Lokshtanov
%A Sergei Vassilvitskii
%A Andrea Vattani
%B Proceedings of the 30th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2013
%E Sanjoy Dasgupta
%E David McAllester
%F pmlr-v28-kumar13a
%I PMLR
%P 27--35
%U https://proceedings.mlr.press/v28/kumar13a.html
%V 28
%N 1
%X Multi-fold cross-validation is an established practice to estimate the error rate of a learning algorithm. Quantifying the variance reduction gains due to cross-validation has been challenging due to the inherent correlations introduced by the folds. In this work we introduce a new and weak measure of stability called loss stability and relate the cross-validation performance to loss stability; we also establish that this relationship is near-optimal. Our work thus quantitatively improves the current best bounds on cross-validation.
RIS
TY  - CPAPER
TI  - Near-Optimal Bounds for Cross-Validation via Loss Stability
AU  - Ravi Kumar
AU  - Daniel Lokshtanov
AU  - Sergei Vassilvitskii
AU  - Andrea Vattani
BT  - Proceedings of the 30th International Conference on Machine Learning
DA  - 2013/02/13
ED  - Sanjoy Dasgupta
ED  - David McAllester
ID  - pmlr-v28-kumar13a
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 28
IS  - 1
SP  - 27
EP  - 35
L1  - http://proceedings.mlr.press/v28/kumar13a.pdf
UR  - https://proceedings.mlr.press/v28/kumar13a.html
AB  - Multi-fold cross-validation is an established practice to estimate the error rate of a learning algorithm. Quantifying the variance reduction gains due to cross-validation has been challenging due to the inherent correlations introduced by the folds. In this work we introduce a new and weak measure of stability called loss stability and relate the cross-validation performance to loss stability; we also establish that this relationship is near-optimal. Our work thus quantitatively improves the current best bounds on cross-validation.
ER  -
APA
Kumar, R., Lokshtanov, D., Vassilvitskii, S., & Vattani, A. (2013). Near-Optimal Bounds for Cross-Validation via Loss Stability. Proceedings of the 30th International Conference on Machine Learning, in Proceedings of Machine Learning Research 28(1):27-35. Available from https://proceedings.mlr.press/v28/kumar13a.html.

Related Material