A General Framework for Structured Sparsity via Proximal Optimization

Luca Baldassarre, Jean Morales, Andreas Argyriou, Massimiliano Pontil
Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics, PMLR 22:82-90, 2012.

Abstract

We study a generalized framework for structured sparsity. It extends the well-known Lasso and Group Lasso methods by incorporating additional constraints on the variables as part of a convex optimization problem. This framework provides a straightforward way of favoring prescribed sparsity patterns, such as orderings, contiguous regions and overlapping groups, among others. Available optimization methods are limited to specific constraint sets and tend not to scale well with sample size and dimensionality. We propose a first-order proximal method, which builds upon results on fixed points and successive approximations. The algorithm can be applied to a general class of conic and norm constraint sets and relies on a proximity operator subproblem which can be computed numerically. Experiments on different regression problems demonstrate state-of-the-art statistical performance, improving over Lasso, Group Lasso and StructOMP. They also demonstrate the efficiency of the optimization algorithm and its scalability with the size of the problem.
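
For orientation, the following minimal Python sketch shows the generic proximal-gradient (forward-backward) template that such first-order methods instantiate, using plain Lasso, whose proximity operator has the closed-form soft-thresholding solution, as the penalty. The paper's contribution concerns the harder case where the prox of a structured penalty has no closed form and must be computed numerically; this sketch does not reproduce that inner solver, and all names in it are illustrative rather than taken from the paper.

    import numpy as np

    def soft_threshold(v, t):
        # Proximity operator of t * ||.||_1 (the Lasso penalty):
        # componentwise soft-thresholding.
        return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

    def proximal_gradient(X, y, prox, lam, n_iter=500):
        # Generic forward-backward loop for
        #   min_w 0.5 * ||X w - y||^2 + lam * Omega(w),
        # where prox(v, t) computes the proximity operator of t * Omega.
        w = np.zeros(X.shape[1])
        # Step size 1/L, with L the Lipschitz constant of the gradient
        # of the smooth part (largest eigenvalue of X^T X).
        L = np.linalg.norm(X, ord=2) ** 2
        for _ in range(n_iter):
            grad = X.T @ (X @ w - y)         # forward (gradient) step
            w = prox(w - grad / L, lam / L)  # backward (proximal) step
        return w

    # Usage: recover a 5-sparse vector with plain Lasso.
    rng = np.random.default_rng(0)
    X = rng.standard_normal((50, 100))
    w_true = np.zeros(100)
    w_true[:5] = 1.0
    y = X @ w_true + 0.1 * rng.standard_normal(50)
    w_hat = proximal_gradient(X, y, soft_threshold, lam=0.1)

Swapping in a different prox function is all it takes to move from Lasso to a structured penalty; when that prox has no closed form, it is itself obtained by an inner numerical routine, which is the regime the paper addresses.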

Cite this Paper


BibTeX
@InProceedings{pmlr-v22-baldassarre12,
  title     = {A General Framework for Structured Sparsity via Proximal Optimization},
  author    = {Baldassarre, Luca and Morales, Jean and Argyriou, Andreas and Pontil, Massimiliano},
  booktitle = {Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics},
  pages     = {82--90},
  year      = {2012},
  editor    = {Lawrence, Neil D. and Girolami, Mark},
  volume    = {22},
  series    = {Proceedings of Machine Learning Research},
  address   = {La Palma, Canary Islands},
  month     = {21--23 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v22/baldassarre12/baldassarre12.pdf},
  url       = {https://proceedings.mlr.press/v22/baldassarre12.html}
}
APA
Baldassarre, L., Morales, J., Argyriou, A. & Pontil, M. (2012). A General Framework for Structured Sparsity via Proximal Optimization. Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 22:82-90. Available from https://proceedings.mlr.press/v22/baldassarre12.html.
