Safe screening rules for L0-regression from Perspective Relaxations

Alper Atamturk, Andres Gomez
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:421-430, 2020.

Abstract

We give safe screening rules to eliminate variables from regression with $\ell_0$ regularization or cardinality constraint. These rules are based on guarantees that a feature may or may not be selected in an optimal solution. The screening rules can be computed from a convex relaxation solution in linear time, without solving the L0-optimization problem. Thus, they can be used in a preprocessing step to safely remove variables from consideration a priori. Numerical experiments on real and synthetic data indicate that a significant number of the variables can be removed quickly, hence reducing the computational burden for optimization substantially. Therefore, the proposed fast and effective screening rules extend the scope of algorithms for L0-regression to larger data sets.
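To illustrate the kind of preprocessing the abstract describes, here is a minimal sketch of safe screening for L0-regression. This is not the paper's perspective-relaxation rule; it uses the special case of an orthonormal design $X^\top X = I$, where the problem $\min_b \|y - Xb\|^2 + \lambda \|b\|_0$ is solved exactly by hard thresholding, so feature $j$ is provably zero in every optimal solution whenever $(x_j^\top y)^2 \le \lambda$. The test is computable in linear time from the per-feature correlations, mirroring the role of a screening step that safely removes variables before the L0 problem is solved.

```python
import numpy as np

def screen_orthonormal(X, y, lam):
    """Toy safe-screening test for L0-regression with orthonormal design.

    Assumes X^T X = I, in which case feature j can be safely fixed to
    zero iff (x_j^T y)^2 <= lam.  Returns a boolean mask of eliminated
    features.  Illustrative only; NOT the paper's perspective rules.
    """
    corr = X.T @ y            # per-feature correlations, O(n * p)
    return corr ** 2 <= lam   # True -> coefficient is provably zero

rng = np.random.default_rng(0)
# Build a 50 x 10 design with orthonormal columns via reduced QR.
X, _ = np.linalg.qr(rng.standard_normal((50, 10)))
b_true = np.zeros(10)
b_true[:3] = [5.0, -4.0, 3.0]   # only the first 3 features are active
y = X @ b_true                  # noiseless response

removed = screen_orthonormal(X, y, lam=1.0)
print(removed.sum())            # -> 7 features eliminated up front
```

Here the seven inactive features are discarded before any L0 solver runs, so the remaining combinatorial problem involves only three candidate variables; the paper's contribution is to obtain analogous guarantees for general designs from the perspective relaxation.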

Cite this Paper


BibTeX
@InProceedings{pmlr-v119-atamturk20a,
  title     = {Safe screening rules for L0-regression from Perspective Relaxations},
  author    = {Atamturk, Alper and Gomez, Andres},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {421--430},
  year      = {2020},
  editor    = {Hal Daumé III and Aarti Singh},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/atamturk20a/atamturk20a.pdf},
  url       = {http://proceedings.mlr.press/v119/atamturk20a.html},
  abstract  = {We give safe screening rules to eliminate variables from regression with $\ell_0$ regularization or cardinality constraint. These rules are based on guarantees that a feature may or may not be selected in an optimal solution. The screening rules can be computed from a convex relaxation solution in linear time, without solving the L0-optimization problem. Thus, they can be used in a preprocessing step to safely remove variables from consideration a priori. Numerical experiments on real and synthetic data indicate that a significant number of the variables can be removed quickly, hence reducing the computational burden for optimization substantially. Therefore, the proposed fast and effective screening rules extend the scope of algorithms for L0-regression to larger data sets.}
}
Endnote
%0 Conference Paper
%T Safe screening rules for L0-regression from Perspective Relaxations
%A Alper Atamturk
%A Andres Gomez
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-atamturk20a
%I PMLR
%P 421--430
%U http://proceedings.mlr.press/v119/atamturk20a.html
%V 119
%X We give safe screening rules to eliminate variables from regression with $\ell_0$ regularization or cardinality constraint. These rules are based on guarantees that a feature may or may not be selected in an optimal solution. The screening rules can be computed from a convex relaxation solution in linear time, without solving the L0-optimization problem. Thus, they can be used in a preprocessing step to safely remove variables from consideration a priori. Numerical experiments on real and synthetic data indicate that a significant number of the variables can be removed quickly, hence reducing the computational burden for optimization substantially. Therefore, the proposed fast and effective screening rules extend the scope of algorithms for L0-regression to larger data sets.
APA
Atamturk, A. & Gomez, A. (2020). Safe screening rules for L0-regression from Perspective Relaxations. In Proceedings of the 37th International Conference on Machine Learning, Proceedings of Machine Learning Research 119:421-430. Available from http://proceedings.mlr.press/v119/atamturk20a.html.