Path Thresholding: Asymptotically Tuning-Free High-Dimensional Sparse Regression

Divyanshu Vats, Richard Baraniuk
Proceedings of the Seventeenth International Conference on Artificial Intelligence and Statistics, PMLR 33:948-957, 2014.

Abstract

In this paper, we address the challenging problem of selecting tuning parameters for high-dimensional sparse regression. We propose a simple and computationally efficient method, called path thresholding (PaTh), that transforms any tuning-parameter-dependent sparse regression algorithm into an asymptotically tuning-free sparse regression algorithm. More specifically, we prove that, as the problem size becomes large (in the number of variables and in the number of observations), PaTh performs accurate sparse regression, under appropriate conditions, without specifying a tuning parameter. In finite-dimensional settings, we demonstrate that PaTh can alleviate the computational burden of model selection algorithms by significantly reducing the search space of tuning parameters.
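The idea the abstract describes can be illustrated with a small sketch: walk the solution path of any path-based sparse regression procedure and stop at a data-driven point, rather than cross-validating a tuning parameter. The code below is only an illustrative stand-in, not the paper's exact PaTh rule: it uses greedy forward selection as the "path" and a hypothetical noise-floor stopping criterion (`c * sigma**2 * n`); the function name, the constant `c`, and the stopping rule are all assumptions for demonstration.

```python
import numpy as np

def path_stop_sketch(X, y, sigma, c=2.0):
    """Illustrative sketch of stopping along a sparse regression path.

    NOT the authors' PaTh algorithm: here the path is greedy forward
    selection, and we stop once the residual energy drops to a
    noise-floor threshold c * sigma^2 * n, i.e. once further variables
    could only be fitting noise.
    """
    n, p = X.shape
    support, beta_s = [], np.zeros(0)
    r = y.copy()
    for _ in range(min(n, p)):
        # Hypothetical stopping rule: residual is already at the noise level.
        if r @ r <= c * sigma**2 * n:
            break
        # Greedy step: pick the column most correlated with the residual.
        j = int(np.argmax(np.abs(X.T @ r)))
        if j in support:
            break
        support.append(j)
        # Refit by least squares on the current support.
        beta_s, *_ = np.linalg.lstsq(X[:, support], y, rcond=None)
        r = y - X[:, support] @ beta_s
    beta = np.zeros(p)
    if support:
        beta[support] = beta_s
    return beta, sorted(support)
```

On a toy problem with a strong sparse signal, the sketch stops after picking up the true support, with no tuning parameter beyond the noise level `sigma` (which PaTh, unlike this sketch, estimates from the data as the problem size grows).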

Cite this Paper


BibTeX
@InProceedings{pmlr-v33-vats14a,
  title = {{Path Thresholding: Asymptotically Tuning-Free High-Dimensional Sparse Regression}},
  author = {Vats, Divyanshu and Baraniuk, Richard},
  booktitle = {Proceedings of the Seventeenth International Conference on Artificial Intelligence and Statistics},
  pages = {948--957},
  year = {2014},
  editor = {Kaski, Samuel and Corander, Jukka},
  volume = {33},
  series = {Proceedings of Machine Learning Research},
  address = {Reykjavik, Iceland},
  month = {22--25 Apr},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v33/vats14a.pdf},
  url = {https://proceedings.mlr.press/v33/vats14a.html},
  abstract = {In this paper, we address the challenging problem of selecting tuning parameters for high-dimensional sparse regression. We propose a simple and computationally efficient method, called path thresholding (PaTh), that transforms any tuning-parameter-dependent sparse regression algorithm into an asymptotically tuning-free sparse regression algorithm. More specifically, we prove that, as the problem size becomes large (in the number of variables and in the number of observations), PaTh performs accurate sparse regression, under appropriate conditions, without specifying a tuning parameter. In finite-dimensional settings, we demonstrate that PaTh can alleviate the computational burden of model selection algorithms by significantly reducing the search space of tuning parameters.}
}
Endnote
%0 Conference Paper
%T Path Thresholding: Asymptotically Tuning-Free High-Dimensional Sparse Regression
%A Divyanshu Vats
%A Richard Baraniuk
%B Proceedings of the Seventeenth International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2014
%E Samuel Kaski
%E Jukka Corander
%F pmlr-v33-vats14a
%I PMLR
%P 948--957
%U https://proceedings.mlr.press/v33/vats14a.html
%V 33
%X In this paper, we address the challenging problem of selecting tuning parameters for high-dimensional sparse regression. We propose a simple and computationally efficient method, called path thresholding (PaTh), that transforms any tuning-parameter-dependent sparse regression algorithm into an asymptotically tuning-free sparse regression algorithm. More specifically, we prove that, as the problem size becomes large (in the number of variables and in the number of observations), PaTh performs accurate sparse regression, under appropriate conditions, without specifying a tuning parameter. In finite-dimensional settings, we demonstrate that PaTh can alleviate the computational burden of model selection algorithms by significantly reducing the search space of tuning parameters.
RIS
TY - CPAPER
TI - Path Thresholding: Asymptotically Tuning-Free High-Dimensional Sparse Regression
AU - Divyanshu Vats
AU - Richard Baraniuk
BT - Proceedings of the Seventeenth International Conference on Artificial Intelligence and Statistics
DA - 2014/04/02
ED - Samuel Kaski
ED - Jukka Corander
ID - pmlr-v33-vats14a
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 33
SP - 948
EP - 957
L1 - http://proceedings.mlr.press/v33/vats14a.pdf
UR - https://proceedings.mlr.press/v33/vats14a.html
AB - In this paper, we address the challenging problem of selecting tuning parameters for high-dimensional sparse regression. We propose a simple and computationally efficient method, called path thresholding (PaTh), that transforms any tuning-parameter-dependent sparse regression algorithm into an asymptotically tuning-free sparse regression algorithm. More specifically, we prove that, as the problem size becomes large (in the number of variables and in the number of observations), PaTh performs accurate sparse regression, under appropriate conditions, without specifying a tuning parameter. In finite-dimensional settings, we demonstrate that PaTh can alleviate the computational burden of model selection algorithms by significantly reducing the search space of tuning parameters.
ER -
APA
Vats, D. & Baraniuk, R. (2014). Path Thresholding: Asymptotically Tuning-Free High-Dimensional Sparse Regression. Proceedings of the Seventeenth International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 33:948-957. Available from https://proceedings.mlr.press/v33/vats14a.html.