Elementary Estimators for High-Dimensional Linear Regression

Eunho Yang, Aurelie Lozano, Pradeep Ravikumar
Proceedings of the 31st International Conference on Machine Learning, PMLR 32(2):388-396, 2014.

Abstract

We consider the problem of structurally constrained high-dimensional linear regression. This problem has attracted considerable attention over the last decade, with state-of-the-art statistical estimators based on solving regularized convex programs. While these typically non-smooth convex programs can be solved in polynomial time, scaling state-of-the-art optimization methods to very large-scale problems remains an ongoing and rich area of research. In this paper, we attempt to address this scaling issue at the source, by asking whether one can build simpler, possibly closed-form estimators that nonetheless come with statistical guarantees comparable to those of regularized likelihood estimators. We answer this question in the affirmative, with variants of the classical ridge and ordinary least squares (OLS) estimators for linear regression. We analyze our estimators in the high-dimensional setting, and provide empirical corroboration of their performance on simulated as well as real-world microarray data.
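To make the flavor of such a closed-form estimator concrete, below is a minimal Python/NumPy sketch under our own assumptions: a ridge estimate (well-defined even when the number of features exceeds the sample size) followed by element-wise soft-thresholding to produce a sparse estimate in closed form. The function name, the tuning parameters eps and lam, and the toy data are illustrative choices based on the abstract's description of "variants of the classical ridge and OLS estimators", not a transcription of the paper's exact procedure.

import numpy as np

def elementary_ridge_estimator(X, y, eps=0.5, lam=0.2):
    """Hypothetical sketch of a closed-form sparse estimator:
    a ridge fit followed by element-wise soft-thresholding.
    eps (ridge regularizer) and lam (threshold level) are assumed
    tuning parameters, not values taken from the paper."""
    n, p = X.shape
    # Ridge estimate: (X^T X / n + eps * I)^{-1} X^T y / n.
    # The eps * I term keeps the system invertible even when p > n.
    theta_ridge = np.linalg.solve(X.T @ X / n + eps * np.eye(p), X.T @ y / n)
    # Soft-threshold each coordinate to obtain a sparse estimate.
    return np.sign(theta_ridge) * np.maximum(np.abs(theta_ridge) - lam, 0.0)

# Toy usage: sparse ground truth in a p > n regime.
rng = np.random.default_rng(0)
n, p = 100, 200
theta_true = np.zeros(p)
theta_true[:5] = 1.0
X = rng.standard_normal((n, p))
y = X @ theta_true + 0.1 * rng.standard_normal(n)
theta_hat = elementary_ridge_estimator(X, y)
print("nonzero coordinates recovered:", np.flatnonzero(theta_hat))

Note that the whole procedure amounts to one linear solve and one thresholding pass, in contrast to the iterative solvers needed for the regularized convex programs the abstract mentions; this is the scaling advantage the paper targets.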

Cite this Paper


BibTeX
@InProceedings{pmlr-v32-yangc14,
  title = {Elementary Estimators for High-Dimensional Linear Regression},
  author = {Yang, Eunho and Lozano, Aurelie and Ravikumar, Pradeep},
  booktitle = {Proceedings of the 31st International Conference on Machine Learning},
  pages = {388--396},
  year = {2014},
  editor = {Xing, Eric P. and Jebara, Tony},
  volume = {32},
  number = {2},
  series = {Proceedings of Machine Learning Research},
  address = {Beijing, China},
  month = {22--24 Jun},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v32/yangc14.pdf},
  url = {https://proceedings.mlr.press/v32/yangc14.html},
  abstract = {We consider the problem of structurally constrained high-dimensional linear regression. This has attracted considerable attention over the last decade, with state of the art statistical estimators based on solving regularized convex programs. While these typically non-smooth convex programs can be solved in polynomial time, scaling the state of the art optimization methods to very large-scale problems is an ongoing and rich area of research. In this paper, we attempt to address this scaling issue at the source, by asking whether one can build \emph{simpler} possibly closed-form estimators, that yet come with statistical guarantees that are nonetheless comparable to regularized likelihood estimators! We answer this question in the affirmative, with variants of the classical ridge and OLS (ordinary least squares estimators) for linear regression. We analyze our estimators in the high-dimensional setting, and moreover provide empirical corroboration of its performance on simulated as well as real world microarray data.}
}
Endnote
%0 Conference Paper
%T Elementary Estimators for High-Dimensional Linear Regression
%A Eunho Yang
%A Aurelie Lozano
%A Pradeep Ravikumar
%B Proceedings of the 31st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2014
%E Eric P. Xing
%E Tony Jebara
%F pmlr-v32-yangc14
%I PMLR
%P 388--396
%U https://proceedings.mlr.press/v32/yangc14.html
%V 32
%N 2
%X We consider the problem of structurally constrained high-dimensional linear regression. This has attracted considerable attention over the last decade, with state of the art statistical estimators based on solving regularized convex programs. While these typically non-smooth convex programs can be solved in polynomial time, scaling the state of the art optimization methods to very large-scale problems is an ongoing and rich area of research. In this paper, we attempt to address this scaling issue at the source, by asking whether one can build simpler possibly closed-form estimators, that yet come with statistical guarantees that are nonetheless comparable to regularized likelihood estimators! We answer this question in the affirmative, with variants of the classical ridge and OLS (ordinary least squares estimators) for linear regression. We analyze our estimators in the high-dimensional setting, and moreover provide empirical corroboration of its performance on simulated as well as real world microarray data.
RIS
TY - CPAPER
TI - Elementary Estimators for High-Dimensional Linear Regression
AU - Eunho Yang
AU - Aurelie Lozano
AU - Pradeep Ravikumar
BT - Proceedings of the 31st International Conference on Machine Learning
DA - 2014/06/18
ED - Eric P. Xing
ED - Tony Jebara
ID - pmlr-v32-yangc14
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 32
IS - 2
SP - 388
EP - 396
L1 - http://proceedings.mlr.press/v32/yangc14.pdf
UR - https://proceedings.mlr.press/v32/yangc14.html
AB - We consider the problem of structurally constrained high-dimensional linear regression. This has attracted considerable attention over the last decade, with state of the art statistical estimators based on solving regularized convex programs. While these typically non-smooth convex programs can be solved in polynomial time, scaling the state of the art optimization methods to very large-scale problems is an ongoing and rich area of research. In this paper, we attempt to address this scaling issue at the source, by asking whether one can build simpler possibly closed-form estimators, that yet come with statistical guarantees that are nonetheless comparable to regularized likelihood estimators! We answer this question in the affirmative, with variants of the classical ridge and OLS (ordinary least squares estimators) for linear regression. We analyze our estimators in the high-dimensional setting, and moreover provide empirical corroboration of its performance on simulated as well as real world microarray data.
ER -
APA
Yang, E., Lozano, A. & Ravikumar, P. (2014). Elementary Estimators for High-Dimensional Linear Regression. Proceedings of the 31st International Conference on Machine Learning, in Proceedings of Machine Learning Research 32(2):388-396. Available from https://proceedings.mlr.press/v32/yangc14.html.
