Level-Set Methods for Finite-Sum Constrained Convex Optimization

Qihang Lin, Runchao Ma, Tianbao Yang
Proceedings of the 35th International Conference on Machine Learning, PMLR 80:3112-3121, 2018.

Abstract

We consider constrained optimization problems in which the objective function and the constraints are defined as sums of finitely many loss functions. This model has applications in machine learning, such as Neyman-Pearson classification. We consider two level-set methods for solving this class of problems: an existing inexact Newton method and a new feasible level-set method. To update the level parameter towards optimality, both methods require an oracle that generates upper and lower bounds, as well as an affine minorant, of the level function. To construct the desired oracle, we reformulate the level function as the value of a saddle-point problem using the conjugates and perspectives of the loss functions. A stochastic variance-reduced gradient method with a special Bregman divergence is then proposed as the oracle for solving that saddle-point problem; the special divergence ensures that the proximal mapping in each iteration can be solved in closed form. The total complexity of both level-set methods using the proposed oracle is analyzed.
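For orientation, the problem class and the level function referred to above can be written as follows. This is a minimal sketch in generic notation, assuming the standard level-set setup; the paper's exact formulation (weights, regularizers, domain) may differ.

% Finite-sum constrained convex problem (generic form):
\begin{align*}
  \min_{x \in \mathcal{X}} \quad & f_0(x) := \frac{1}{n}\sum_{i=1}^{n} f_{0,i}(x) \\
  \text{s.t.} \quad & f_j(x) := \frac{1}{n}\sum_{i=1}^{n} f_{j,i}(x) \le 0,
    \qquad j = 1,\dots,m.
\end{align*}
% Level function: fold the objective and the constraints into a single
% max, parameterized by the level r. H is convex and nonincreasing in r
% (partial minimization over x of a function jointly convex in (x, r)),
% and, under a standard strict-feasibility assumption, H(r) = 0 exactly
% when r equals the optimal value. Both level-set methods therefore run
% a root-finding search on r, and at each step query an oracle for upper
% and lower bounds on H(r) and for an affine minorant (a subgradient cut).
\[
  H(r) := \min_{x \in \mathcal{X}} \;
    \max\big\{\, f_0(x) - r,\; f_1(x),\; \dots,\; f_m(x) \,\big\}.
\]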

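The conjugate-and-perspective reformulation mentioned above follows, at a high level, the standard pattern sketched here; this is a hedged outline in the same generic notation, not the paper's exact construction (which also exploits the finite-sum structure of each f_j).

% Step 1: rewrite the max over the m+1 terms as a maximization over the
% simplex \Delta := \{ \lambda \ge 0 : \sum_{j=0}^{m} \lambda_j = 1 \}:
\[
  H(r) = \min_{x \in \mathcal{X}} \max_{\lambda \in \Delta}
    \Big[\, \lambda_0 \big( f_0(x) - r \big)
      + \sum_{j=1}^{m} \lambda_j f_j(x) \,\Big].
\]
% Step 2: replace each weighted term by its Fenchel representation, with
% the weight absorbed by the perspective of the conjugate: for \lambda_j > 0,
\[
  \lambda_j f_j(x) = \sup_{y_j} \Big\{ \langle x, y_j \rangle
    - \lambda_j f_j^{*}\big( y_j / \lambda_j \big) \Big\},
\]
% where (\lambda_j, y_j) \mapsto \lambda_j f_j^{*}(y_j / \lambda_j) is the
% (jointly convex) perspective of f_j^{*}. The result is a convex-concave
% saddle-point problem in (x; \lambda, y) that is bilinear in x, which is
% the form a variance-reduced stochastic primal-dual method can handle,
% with a suitably chosen Bregman divergence keeping each proximal step
% available in closed form.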
Cite this Paper


BibTeX
@InProceedings{pmlr-v80-lin18c,
  title     = {Level-Set Methods for Finite-Sum Constrained Convex Optimization},
  author    = {Lin, Qihang and Ma, Runchao and Yang, Tianbao},
  booktitle = {Proceedings of the 35th International Conference on Machine Learning},
  pages     = {3112--3121},
  year      = {2018},
  editor    = {Dy, Jennifer and Krause, Andreas},
  volume    = {80},
  series    = {Proceedings of Machine Learning Research},
  month     = {10--15 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v80/lin18c/lin18c.pdf},
  url       = {https://proceedings.mlr.press/v80/lin18c.html}
}
Endnote
%0 Conference Paper
%T Level-Set Methods for Finite-Sum Constrained Convex Optimization
%A Qihang Lin
%A Runchao Ma
%A Tianbao Yang
%B Proceedings of the 35th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2018
%E Jennifer Dy
%E Andreas Krause
%F pmlr-v80-lin18c
%I PMLR
%P 3112--3121
%U https://proceedings.mlr.press/v80/lin18c.html
%V 80
APA
Lin, Q., Ma, R. & Yang, T. (2018). Level-Set Methods for Finite-Sum Constrained Convex Optimization. Proceedings of the 35th International Conference on Machine Learning, in Proceedings of Machine Learning Research 80:3112-3121. Available from https://proceedings.mlr.press/v80/lin18c.html.