Projection onto Minkowski Sums with Application to Constrained Learning

Joong-Ho Won, Jason Xu, Kenneth Lange
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:3642-3651, 2019.

Abstract

We introduce block descent algorithms for projecting onto Minkowski sums of sets. Projection onto such sets is a crucial step in many statistical learning problems, and may regularize complexity of solutions to an optimization problem or arise in dual formulations of penalty methods. We show that projecting onto the Minkowski sum admits simple, efficient algorithms when complications such as overlapping constraints pose challenges to existing methods. We prove that our algorithm converges linearly when sets are strongly convex or satisfy an error bound condition, and extend the theory and methods to encompass non-convex sets as well. We demonstrate empirical advantages in runtime and accuracy over competitors in applications to $\ell_{1,p}$-regularized learning, constrained lasso, and overlapping group lasso.
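The block descent idea in the abstract can be sketched concretely: to project a point z onto a Minkowski sum A + B, minimize ||x + y - z||^2 by alternating exact updates over the two blocks, x ← P_A(z - y) and y ← P_B(z - x). The sketch below is illustrative and not the authors' code; it uses two Euclidean balls (whose projections are closed-form) and made-up radii and test point.

```python
import numpy as np

def project_ball(v, radius):
    """Project v onto the Euclidean ball of the given radius centered at 0."""
    norm = np.linalg.norm(v)
    return v if norm <= radius else (radius / norm) * v

def project_minkowski_sum(z, rA, rB, iters=100):
    """Illustrative block descent for projecting z onto A + B,
    where A and B are Euclidean balls of radii rA and rB.
    Minimizes ||x + y - z||^2 by alternating exact block updates."""
    x = np.zeros_like(z)
    y = np.zeros_like(z)
    for _ in range(iters):
        x = project_ball(z - y, rA)  # exact minimization over the x-block
        y = project_ball(z - x, rB)  # exact minimization over the y-block
    return x + y  # the projection of z onto A + B

# Example values (hypothetical): the sum of two unit balls is the ball of
# radius 2, so projecting z = (3, 4) should yield a point of norm 2.
z = np.array([3.0, 4.0])
p = project_minkowski_sum(z, rA=1.0, rB=1.0)
print(p)
```

Each block update is itself a projection onto a single, simpler set, which is what makes the scheme attractive when the summands have cheap individual projections.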

Cite this Paper


BibTeX
@InProceedings{pmlr-v97-lange19a,
  title     = {Projection onto {M}inkowski Sums with Application to Constrained Learning},
  author    = {Won, Joong-Ho and Xu, Jason and Lange, Kenneth},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning},
  pages     = {3642--3651},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Salakhutdinov, Ruslan},
  volume    = {97},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--15 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v97/lange19a/lange19a.pdf},
  url       = {https://proceedings.mlr.press/v97/lange19a.html},
  abstract  = {We introduce block descent algorithms for projecting onto Minkowski sums of sets. Projection onto such sets is a crucial step in many statistical learning problems, and may regularize complexity of solutions to an optimization problem or arise in dual formulations of penalty methods. We show that projecting onto the Minkowski sum admits simple, efficient algorithms when complications such as overlapping constraints pose challenges to existing methods. We prove that our algorithm converges linearly when sets are strongly convex or satisfy an error bound condition, and extend the theory and methods to encompass non-convex sets as well. We demonstrate empirical advantages in runtime and accuracy over competitors in applications to $\ell_{1,p}$-regularized learning, constrained lasso, and overlapping group lasso.}
}
Endnote
%0 Conference Paper
%T Projection onto Minkowski Sums with Application to Constrained Learning
%A Joong-Ho Won
%A Jason Xu
%A Kenneth Lange
%B Proceedings of the 36th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Ruslan Salakhutdinov
%F pmlr-v97-lange19a
%I PMLR
%P 3642--3651
%U https://proceedings.mlr.press/v97/lange19a.html
%V 97
%X We introduce block descent algorithms for projecting onto Minkowski sums of sets. Projection onto such sets is a crucial step in many statistical learning problems, and may regularize complexity of solutions to an optimization problem or arise in dual formulations of penalty methods. We show that projecting onto the Minkowski sum admits simple, efficient algorithms when complications such as overlapping constraints pose challenges to existing methods. We prove that our algorithm converges linearly when sets are strongly convex or satisfy an error bound condition, and extend the theory and methods to encompass non-convex sets as well. We demonstrate empirical advantages in runtime and accuracy over competitors in applications to $\ell_{1,p}$-regularized learning, constrained lasso, and overlapping group lasso.
APA
Won, J., Xu, J. & Lange, K. (2019). Projection onto Minkowski Sums with Application to Constrained Learning. Proceedings of the 36th International Conference on Machine Learning, in Proceedings of Machine Learning Research 97:3642-3651. Available from https://proceedings.mlr.press/v97/lange19a.html.