Estimation Consistency of the Group Lasso and its Applications

Han Liu, Jian Zhang
Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics, PMLR 5:376-383, 2009.

Abstract

We extend the $\ell_2$-consistency result of Meinshausen and Yu (2008) from the Lasso to the group Lasso. Our main theorem shows that the group Lasso achieves estimation consistency under a mild condition, and an asymptotic upper bound on the number of selected variables can be obtained. As a result, we can apply the nonnegative garrote procedure to the group Lasso estimate to obtain an estimator that is simultaneously estimation and variable selection consistent. In particular, our setting allows both the number of groups and the number of variables per group to increase, and is thus applicable to high-dimensional problems. We also provide an estimation consistency analysis for a version of the sparse additive models with increasing dimensions. Some finite-sample results are also reported.
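For readers who want a concrete picture of the estimator studied here, the following is a minimal sketch of the group Lasso fitted by proximal gradient descent (ISTA) with block soft-thresholding. It is an illustration only, not the authors' implementation; the function name, default step size, and iteration count are choices made for this sketch.

```python
import numpy as np

def group_lasso(X, y, groups, lam, step=None, n_iter=500):
    """Proximal-gradient sketch of the group Lasso.

    Minimizes (1/(2n)) * ||y - X b||^2 + lam * sum_g sqrt(|g|) * ||b_g||_2,
    where `groups` is a list of index arrays, one per group.
    """
    n, p = X.shape
    if step is None:
        # Inverse Lipschitz constant of the gradient of the smooth part.
        step = n / np.linalg.norm(X, 2) ** 2
    b = np.zeros(p)
    for _ in range(n_iter):
        grad = X.T @ (X @ b - y) / n          # gradient step on the squared loss
        z = b - step * grad
        for g in groups:                       # block soft-thresholding per group
            t = lam * step * np.sqrt(len(g))
            nz = np.linalg.norm(z[g])
            z[g] = 0.0 if nz <= t else (1 - t / nz) * z[g]
        b = z
    return b
```

The per-group threshold scales with the square root of the group size, matching the usual group Lasso penalty weighting; groups whose correlation with the residual stays below the threshold are set exactly to zero, which is the group-level sparsity the paper's selection results concern.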

Cite this Paper


BibTeX
@InProceedings{pmlr-v5-liu09a,
  title     = {Estimation Consistency of the Group Lasso and its Applications},
  author    = {Liu, Han and Zhang, Jian},
  booktitle = {Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics},
  pages     = {376--383},
  year      = {2009},
  editor    = {van Dyk, David and Welling, Max},
  volume    = {5},
  series    = {Proceedings of Machine Learning Research},
  address   = {Hilton Clearwater Beach Resort, Clearwater Beach, Florida USA},
  month     = {16--18 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v5/liu09a/liu09a.pdf},
  url       = {https://proceedings.mlr.press/v5/liu09a.html},
  abstract  = {We extend the $\ell_2$-consistency result of Meinshausen and Yu (2008) from the Lasso to the group Lasso. Our main theorem shows that the group Lasso achieves estimation consistency under a mild condition, and an asymptotic upper bound on the number of selected variables can be obtained. As a result, we can apply the nonnegative garrote procedure to the group Lasso estimate to obtain an estimator that is simultaneously estimation and variable selection consistent. In particular, our setting allows both the number of groups and the number of variables per group to increase, and is thus applicable to high-dimensional problems. We also provide an estimation consistency analysis for a version of the sparse additive models with increasing dimensions. Some finite-sample results are also reported.}
}
Endnote
%0 Conference Paper
%T Estimation Consistency of the Group Lasso and its Applications
%A Han Liu
%A Jian Zhang
%B Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2009
%E David van Dyk
%E Max Welling
%F pmlr-v5-liu09a
%I PMLR
%P 376--383
%U https://proceedings.mlr.press/v5/liu09a.html
%V 5
%X We extend the $\ell_2$-consistency result of Meinshausen and Yu (2008) from the Lasso to the group Lasso. Our main theorem shows that the group Lasso achieves estimation consistency under a mild condition, and an asymptotic upper bound on the number of selected variables can be obtained. As a result, we can apply the nonnegative garrote procedure to the group Lasso estimate to obtain an estimator that is simultaneously estimation and variable selection consistent. In particular, our setting allows both the number of groups and the number of variables per group to increase, and is thus applicable to high-dimensional problems. We also provide an estimation consistency analysis for a version of the sparse additive models with increasing dimensions. Some finite-sample results are also reported.
RIS
TY - CPAPER
TI - Estimation Consistency of the Group Lasso and its Applications
AU - Han Liu
AU - Jian Zhang
BT - Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics
DA - 2009/04/15
ED - David van Dyk
ED - Max Welling
ID - pmlr-v5-liu09a
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 5
SP - 376
EP - 383
L1 - http://proceedings.mlr.press/v5/liu09a/liu09a.pdf
UR - https://proceedings.mlr.press/v5/liu09a.html
AB - We extend the $\ell_2$-consistency result of Meinshausen and Yu (2008) from the Lasso to the group Lasso. Our main theorem shows that the group Lasso achieves estimation consistency under a mild condition, and an asymptotic upper bound on the number of selected variables can be obtained. As a result, we can apply the nonnegative garrote procedure to the group Lasso estimate to obtain an estimator that is simultaneously estimation and variable selection consistent. In particular, our setting allows both the number of groups and the number of variables per group to increase, and is thus applicable to high-dimensional problems. We also provide an estimation consistency analysis for a version of the sparse additive models with increasing dimensions. Some finite-sample results are also reported.
ER -
APA
Liu, H. & Zhang, J. (2009). Estimation Consistency of the Group Lasso and its Applications. Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 5:376-383. Available from https://proceedings.mlr.press/v5/liu09a.html.
