Blitz: A Principled Meta-Algorithm for Scaling Sparse Optimization

Tyler Johnson, Carlos Guestrin
Proceedings of the 32nd International Conference on Machine Learning, PMLR 37:1171-1179, 2015.

Abstract

By reducing optimization to a sequence of small subproblems, working set methods achieve fast convergence times for many challenging problems. Despite excellent performance, theoretical understanding of working sets is limited, and implementations often resort to heuristics to determine subproblem size, makeup, and stopping criteria. We propose Blitz, a fast working set algorithm accompanied by useful guarantees. Making no assumptions on data, our theory relates subproblem size to progress toward convergence. This result motivates methods for optimizing algorithmic parameters and discarding irrelevant variables as iterations progress. Applied to L1-regularized learning, Blitz convincingly outperforms existing solvers in sequential, limited-memory, and distributed settings. Blitz is not specific to L1-regularized learning, making the algorithm relevant to many applications involving sparsity or constraints.
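To make the working-set idea concrete, below is a minimal sketch of a generic working-set loop for the Lasso (L1-regularized least squares), written in Python with NumPy. It is not the authors' Blitz implementation: the subproblem-size heuristic, the selection of most-violating coordinates, the coordinate-descent inner solver, and the name lasso_working_set are illustrative assumptions, whereas Blitz's contribution is precisely to replace such heuristics with choices backed by guarantees.

# Illustrative sketch of a generic working-set loop for the Lasso
# (L1-regularized least squares). This is NOT the Blitz algorithm from
# the paper; subproblem selection, sizing, and stopping rules here are
# simple placeholders chosen for readability.
import numpy as np

def lasso_working_set(A, b, lam, outer_iters=20, inner_iters=100, tol=1e-6):
    n, d = A.shape
    x = np.zeros(d)
    col_sq = (A ** 2).sum(axis=0)            # per-feature squared norms
    for _ in range(outer_iters):
        residual = b - A @ x
        grad = -A.T @ residual                # gradient of the smooth part
        # Per-coordinate violation of the L1 optimality conditions.
        viol = np.where(x != 0,
                        np.abs(grad + lam * np.sign(x)),
                        np.maximum(np.abs(grad) - lam, 0.0))
        if viol.max() <= tol:
            break
        # Working set: current nonzeros plus the most violating coordinates.
        k = max(10, 2 * np.count_nonzero(x))  # heuristic subproblem size
        ws = np.union1d(np.flatnonzero(x), np.argsort(-viol)[:k])
        # Approximately solve the subproblem restricted to the working set
        # with cyclic coordinate descent (soft-thresholding updates).
        for _ in range(inner_iters):
            for j in ws:
                if col_sq[j] == 0:
                    continue
                residual += A[:, j] * x[j]    # remove coordinate j's contribution
                rho = A[:, j] @ residual
                x[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
                residual -= A[:, j] * x[j]    # add it back with the new value
    return x

In this sketch each outer pass restricts optimization to the nonzero coordinates plus those that most violate the L1 optimality conditions. The abstract's point is that Blitz instead determines subproblem size, makeup, and stopping criteria from theory relating them to progress toward convergence, and uses the same analysis to discard irrelevant variables as iterations progress, rather than relying on fixed heuristics like the ones above.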

Cite this Paper


BibTeX
@InProceedings{pmlr-v37-johnson15,
  title     = {Blitz: A Principled Meta-Algorithm for Scaling Sparse Optimization},
  author    = {Johnson, Tyler and Guestrin, Carlos},
  booktitle = {Proceedings of the 32nd International Conference on Machine Learning},
  pages     = {1171--1179},
  year      = {2015},
  editor    = {Bach, Francis and Blei, David},
  volume    = {37},
  series    = {Proceedings of Machine Learning Research},
  address   = {Lille, France},
  month     = {07--09 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v37/johnson15.pdf},
  url       = {https://proceedings.mlr.press/v37/johnson15.html}
}
APA
Johnson, T. & Guestrin, C. (2015). Blitz: A Principled Meta-Algorithm for Scaling Sparse Optimization. Proceedings of the 32nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 37:1171-1179. Available from https://proceedings.mlr.press/v37/johnson15.html.
