Fast Active-set-type Algorithms for $\ell_1$-regularized Linear Regression

Jingu Kim, Haesun Park
Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics, PMLR 9:397-404, 2010.

Abstract

In this paper, we investigate new active-set-type methods for $\ell_1$-regularized linear regression that overcome some difficulties of existing active set methods. By showing a relationship between $\ell_1$-regularized linear regression and the linear complementarity problem with bounds, we present a fast active-set-type method, called block principal pivoting. This method accelerates computation by allowing exchanges of several variables among working sets. We further provide an improvement of this method, discuss its properties, and also explain a connection to the structure learning of Gaussian graphical models. Experimental comparisons on synthetic and real data sets show that the proposed method is significantly faster than existing active set methods and competitive against recently developed iterative methods.
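To make the idea concrete, the following is a minimal sketch of block principal pivoting applied to the lasso, min 0.5‖Ax−b‖² + λ‖x‖₁. It is not the authors' reference implementation: the sign-set bookkeeping, the patience constant for the backup rule, and the function name `lasso_bpp` are illustrative assumptions. The key feature matching the abstract is that *all* currently infeasible variables are exchanged between the working sets at once, falling back to a single exchange only when the infeasibility count stops decreasing.

```python
import numpy as np

def lasso_bpp(A, b, lam, max_iter=100):
    """Illustrative block principal pivoting sketch for the lasso
    (hypothetical code, not the paper's reference implementation)."""
    n = A.shape[1]
    s = np.zeros(n, dtype=int)   # +1/-1: free variable with that sign; 0: fixed at zero
    x = np.zeros(n)
    patience = 3                 # assumed backup-rule patience
    ninf_best = n + 1
    for _ in range(max_iter):
        F = np.flatnonzero(s != 0)
        G = np.flatnonzero(s == 0)
        x = np.zeros(n)
        if F.size:
            # Free variables solve the reduced normal equations with sign offsets:
            # A_F^T A_F x_F = A_F^T b - lam * s_F  (stationarity on the support).
            AF = A[:, F]
            x[F] = np.linalg.solve(AF.T @ AF, AF.T @ b - lam * s[F])
        g = A.T @ (A @ x - b)    # gradient of the smooth part
        # Infeasible variables: wrong-sign free variables, and fixed variables
        # whose gradient violates the subgradient bound |g_j| <= lam.
        H1 = F[np.sign(x[F]) != s[F]]
        H2 = G[np.abs(g[G]) > lam + 1e-10]
        ninf = H1.size + H2.size
        if ninf == 0:
            return x             # all KKT conditions hold
        if ninf < ninf_best:
            ninf_best, patience = ninf, 3   # full exchange is making progress
        elif patience > 0:
            patience -= 1                    # tolerate a few non-improving steps
        else:
            # Backup rule: exchange only the single largest-index infeasible variable.
            i = max(np.concatenate([H1, H2]))
            H1, H2 = H1[H1 == i], H2[H2 == i]
        s[H1] = 0                            # drop wrong-sign variables from the support
        s[H2] = -np.sign(g[H2]).astype(int)  # add violators with the descent sign
    return x
```

On a trivial orthogonal problem the result matches the soft-thresholding solution, e.g. `lasso_bpp(np.eye(3), np.array([3.0, 0.5, -2.0]), 1.0)` gives `[2, 0, -1]`; unlike classical single-pivot active set methods, the support here can grow or shrink by many variables per iteration.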

Cite this Paper


BibTeX
@InProceedings{pmlr-v9-kim10a,
  title     = {Fast Active-set-type Algorithms for $\ell_1$-regularized Linear Regression},
  author    = {Kim, Jingu and Park, Haesun},
  booktitle = {Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics},
  pages     = {397--404},
  year      = {2010},
  editor    = {Teh, Yee Whye and Titterington, Mike},
  volume    = {9},
  series    = {Proceedings of Machine Learning Research},
  address   = {Chia Laguna Resort, Sardinia, Italy},
  month     = {13--15 May},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v9/kim10a/kim10a.pdf},
  url       = {https://proceedings.mlr.press/v9/kim10a.html},
  abstract  = {In this paper, we investigate new active-set-type methods for $\ell_1$-regularized linear regression that overcome some difficulties of existing active set methods. By showing a relationship between $\ell_1$-regularized linear regression and the linear complementarity problem with bounds, we present a fast active-set-type method, called block principal pivoting. This method accelerates computation by allowing exchanges of several variables among working sets. We further provide an improvement of this method, discuss its properties, and also explain a connection to the structure learning of Gaussian graphical models. Experimental comparisons on synthetic and real data sets show that the proposed method is significantly faster than existing active set methods and competitive against recently developed iterative methods.}
}
Endnote
%0 Conference Paper
%T Fast Active-set-type Algorithms for $\ell_1$-regularized Linear Regression
%A Jingu Kim
%A Haesun Park
%B Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2010
%E Yee Whye Teh
%E Mike Titterington
%F pmlr-v9-kim10a
%I PMLR
%P 397--404
%U https://proceedings.mlr.press/v9/kim10a.html
%V 9
%X In this paper, we investigate new active-set-type methods for $\ell_1$-regularized linear regression that overcome some difficulties of existing active set methods. By showing a relationship between $\ell_1$-regularized linear regression and the linear complementarity problem with bounds, we present a fast active-set-type method, called block principal pivoting. This method accelerates computation by allowing exchanges of several variables among working sets. We further provide an improvement of this method, discuss its properties, and also explain a connection to the structure learning of Gaussian graphical models. Experimental comparisons on synthetic and real data sets show that the proposed method is significantly faster than existing active set methods and competitive against recently developed iterative methods.
RIS
TY  - CPAPER
TI  - Fast Active-set-type Algorithms for $\ell_1$-regularized Linear Regression
AU  - Jingu Kim
AU  - Haesun Park
BT  - Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics
DA  - 2010/03/31
ED  - Yee Whye Teh
ED  - Mike Titterington
ID  - pmlr-v9-kim10a
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 9
SP  - 397
EP  - 404
L1  - http://proceedings.mlr.press/v9/kim10a/kim10a.pdf
UR  - https://proceedings.mlr.press/v9/kim10a.html
AB  - In this paper, we investigate new active-set-type methods for $\ell_1$-regularized linear regression that overcome some difficulties of existing active set methods. By showing a relationship between $\ell_1$-regularized linear regression and the linear complementarity problem with bounds, we present a fast active-set-type method, called block principal pivoting. This method accelerates computation by allowing exchanges of several variables among working sets. We further provide an improvement of this method, discuss its properties, and also explain a connection to the structure learning of Gaussian graphical models. Experimental comparisons on synthetic and real data sets show that the proposed method is significantly faster than existing active set methods and competitive against recently developed iterative methods.
ER  -
APA
Kim, J. & Park, H. (2010). Fast Active-set-type Algorithms for $\ell_1$-regularized Linear Regression. Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 9:397-404. Available from https://proceedings.mlr.press/v9/kim10a.html.