The lasso, persistence, and cross-validation

Darren Homrighausen, Daniel McDonald
Proceedings of the 30th International Conference on Machine Learning, PMLR 28(3):1031-1039, 2013.

Abstract

During the last fifteen years, the lasso procedure has been the target of a substantial amount of theoretical and applied research. Correspondingly, many results are known about its behavior for a fixed or optimally chosen smoothing parameter (given up to unknown constants). Much less, however, is known about the lasso's behavior when the smoothing parameter is chosen in a data-dependent way. To this end, we give the first result about the risk consistency of the lasso when the smoothing parameter is chosen via cross-validation. We consider the high-dimensional setting wherein the number of predictors p = n^α, α > 0, grows with the number of observations.
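The setting the abstract describes — a sparse linear model with p = n^α predictors, where the lasso's smoothing parameter is selected by cross-validation — can be sketched as follows. This is an illustrative implementation only, not the authors' code: `lasso_cd` is a hypothetical helper running cyclic coordinate descent, and `cv_lasso` picks the smoothing parameter by K-fold cross-validated prediction error over an assumed grid.

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=200):
    """Lasso via cyclic coordinate descent: min_b (1/2n)||y - Xb||^2 + lam*||b||_1."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n
    r = y.copy()                          # residual y - X @ b (b starts at 0)
    for _ in range(n_iter):
        for j in range(p):
            r += X[:, j] * b[j]           # partial residual excluding coordinate j
            rho = X[:, j] @ r / n
            # soft-thresholding update for coordinate j
            b[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
            r -= X[:, j] * b[j]
    return b

def cv_lasso(X, y, lams, k=5):
    """Data-dependent choice of the smoothing parameter by k-fold cross-validation."""
    n = X.shape[0]
    folds = np.array_split(np.arange(n), k)
    cv_err = []
    for lam in lams:
        err = 0.0
        for f in folds:
            mask = np.ones(n, bool)
            mask[f] = False               # hold fold f out, fit on the rest
            b = lasso_cd(X[mask], y[mask], lam)
            err += np.mean((y[f] - X[f] @ b) ** 2)
        cv_err.append(err / k)
    return lams[int(np.argmin(cv_err))]

# Simulated high-dimensional-style design: p = n^alpha with alpha = 0.5.
rng = np.random.default_rng(0)
n = 200
p = int(n ** 0.5)
beta = np.zeros(p)
beta[:3] = [2.0, -1.5, 1.0]               # sparse truth: 3 active predictors
X = rng.standard_normal((n, p))
y = X @ beta + 0.5 * rng.standard_normal(n)

lams = np.logspace(-3, 0, 10)             # assumed candidate grid
lam_hat = cv_lasso(X, y, lams)            # cross-validated smoothing parameter
```

The paper's question is precisely about `lam_hat`: the grid value minimizing held-out error is random (it depends on the data and the fold split), so risk guarantees proved for a fixed or oracle smoothing parameter do not automatically transfer to it.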

Cite this Paper


BibTeX
@InProceedings{pmlr-v28-homrighausen13,
  title     = {The lasso, persistence, and cross-validation},
  author    = {Homrighausen, Darren and McDonald, Daniel},
  booktitle = {Proceedings of the 30th International Conference on Machine Learning},
  pages     = {1031--1039},
  year      = {2013},
  editor    = {Dasgupta, Sanjoy and McAllester, David},
  volume    = {28},
  number    = {3},
  series    = {Proceedings of Machine Learning Research},
  address   = {Atlanta, Georgia, USA},
  month     = {17--19 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v28/homrighausen13.pdf},
  url       = {https://proceedings.mlr.press/v28/homrighausen13.html},
  abstract  = {During the last fifteen years, the lasso procedure has been the target of a substantial amount of theoretical and applied research. Correspondingly, many results are known about its behavior for a fixed or optimally chosen smoothing parameter (given up to unknown constants). Much less, however, is known about the lasso’s behavior when the smoothing parameter is chosen in a data dependent way. To this end, we give the first result about the risk consistency of lasso when the smoothing parameter is chosen via cross-validation. We consider the high-dimensional setting wherein the number of predictors p=n^α, α>0 grows with the number of observations.}
}
Endnote
%0 Conference Paper
%T The lasso, persistence, and cross-validation
%A Darren Homrighausen
%A Daniel McDonald
%B Proceedings of the 30th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2013
%E Sanjoy Dasgupta
%E David McAllester
%F pmlr-v28-homrighausen13
%I PMLR
%P 1031--1039
%U https://proceedings.mlr.press/v28/homrighausen13.html
%V 28
%N 3
%X During the last fifteen years, the lasso procedure has been the target of a substantial amount of theoretical and applied research. Correspondingly, many results are known about its behavior for a fixed or optimally chosen smoothing parameter (given up to unknown constants). Much less, however, is known about the lasso’s behavior when the smoothing parameter is chosen in a data dependent way. To this end, we give the first result about the risk consistency of lasso when the smoothing parameter is chosen via cross-validation. We consider the high-dimensional setting wherein the number of predictors p=n^α, α>0 grows with the number of observations.
RIS
TY  - CPAPER
TI  - The lasso, persistence, and cross-validation
AU  - Darren Homrighausen
AU  - Daniel McDonald
BT  - Proceedings of the 30th International Conference on Machine Learning
DA  - 2013/05/26
ED  - Sanjoy Dasgupta
ED  - David McAllester
ID  - pmlr-v28-homrighausen13
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 28
IS  - 3
SP  - 1031
EP  - 1039
L1  - http://proceedings.mlr.press/v28/homrighausen13.pdf
UR  - https://proceedings.mlr.press/v28/homrighausen13.html
AB  - During the last fifteen years, the lasso procedure has been the target of a substantial amount of theoretical and applied research. Correspondingly, many results are known about its behavior for a fixed or optimally chosen smoothing parameter (given up to unknown constants). Much less, however, is known about the lasso’s behavior when the smoothing parameter is chosen in a data dependent way. To this end, we give the first result about the risk consistency of lasso when the smoothing parameter is chosen via cross-validation. We consider the high-dimensional setting wherein the number of predictors p=n^α, α>0 grows with the number of observations.
ER  -
APA
Homrighausen, D. & McDonald, D. (2013). The lasso, persistence, and cross-validation. Proceedings of the 30th International Conference on Machine Learning, in Proceedings of Machine Learning Research 28(3):1031-1039. Available from https://proceedings.mlr.press/v28/homrighausen13.html.
