Un-regularizing: approximate proximal point and faster stochastic algorithms for empirical risk minimization

Roy Frostig, Rong Ge, Sham Kakade, Aaron Sidford
Proceedings of the 32nd International Conference on Machine Learning, PMLR 37:2540-2548, 2015.

Abstract

We develop a family of accelerated stochastic algorithms that optimize sums of convex functions. Our algorithms improve upon the fastest running time for empirical risk minimization (ERM), and in particular linear least-squares regression, across a wide range of problem settings. To achieve this, we establish a framework, based on the classical proximal point algorithm, useful for accelerating recent fast stochastic algorithms in a black-box fashion. Empirically, we demonstrate that the resulting algorithms exhibit notions of stability that are advantageous in practice. Both in theory and in practice, the provided algorithms reap the computational benefits of adding a large strongly convex regularization term, without incurring a corresponding bias to the original ERM problem.
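To make the framework concrete: the paper minimizes an unregularized objective f by repeatedly solving regularized proximal subproblems, x_{t+1} ≈ argmin_x f(x) + (λ/2)||x − x_t||². Each subproblem is λ-strongly convex, so fast stochastic solvers handle it cheaply, and because the added quadratic is centered at the previous iterate, its bias vanishes as the iterates converge. Below is a minimal sketch of this outer loop. It is illustrative only: plain gradient descent stands in for the fast stochastic inner solvers (e.g. SVRG) that the paper accelerates, and all names, step sizes, and iteration counts here are assumptions, not the paper's settings.

import numpy as np

def inner_solve(grad, x_init, smoothness, steps=200):
    """Approximately minimize a smooth convex function given its gradient.

    Stand-in for a fast stochastic solver; plain gradient descent with
    step size 1/smoothness.
    """
    x = x_init.copy()
    for _ in range(steps):
        x -= grad(x) / smoothness
    return x

def approx_prox_point(grad_f, x0, lam, smoothness, outer_steps=30):
    """Outer loop: x_{t+1} ~ argmin_x f(x) + (lam/2)||x - x_t||^2.

    Each subproblem is lam-strongly convex, so the inner solver converges
    quickly; the quadratic is centered at the previous iterate, so its
    effect cancels at the fixed point and the iterates approach a
    minimizer of the unregularized f.
    """
    x = x0.copy()
    for _ in range(outer_steps):
        x_prev = x.copy()
        # Gradient of the regularized subproblem around x_prev.
        sub_grad = lambda z, xp=x_prev: grad_f(z) + lam * (z - xp)
        x = inner_solve(sub_grad, x, smoothness + lam)
    return x

# Usage: linear least-squares ERM, f(x) = (1/2n)||Ax - b||^2.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 10))
b = rng.standard_normal(100)
n = A.shape[0]
grad_f = lambda x: A.T @ (A @ x - b) / n
L = np.linalg.norm(A, 2) ** 2 / n  # smoothness constant of f
x_hat = approx_prox_point(grad_f, np.zeros(10), lam=1.0, smoothness=L)
print(np.linalg.norm(grad_f(x_hat)))  # gradient norm; should be near zero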

Cite this Paper


BibTeX
@InProceedings{pmlr-v37-frostig15,
  title     = {Un-regularizing: approximate proximal point and faster stochastic algorithms for empirical risk minimization},
  author    = {Frostig, Roy and Ge, Rong and Kakade, Sham and Sidford, Aaron},
  booktitle = {Proceedings of the 32nd International Conference on Machine Learning},
  pages     = {2540--2548},
  year      = {2015},
  editor    = {Bach, Francis and Blei, David},
  volume    = {37},
  series    = {Proceedings of Machine Learning Research},
  address   = {Lille, France},
  month     = {07--09 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v37/frostig15.pdf},
  url       = {https://proceedings.mlr.press/v37/frostig15.html},
  abstract  = {We develop a family of accelerated stochastic algorithms that optimize sums of convex functions. Our algorithms improve upon the fastest running time for empirical risk minimization (ERM), and in particular linear least-squares regression, across a wide range of problem settings. To achieve this, we establish a framework, based on the classical proximal point algorithm, useful for accelerating recent fast stochastic algorithms in a black-box fashion. Empirically, we demonstrate that the resulting algorithms exhibit notions of stability that are advantageous in practice. Both in theory and in practice, the provided algorithms reap the computational benefits of adding a large strongly convex regularization term, without incurring a corresponding bias to the original ERM problem.}
}
Endnote
%0 Conference Paper
%T Un-regularizing: approximate proximal point and faster stochastic algorithms for empirical risk minimization
%A Roy Frostig
%A Rong Ge
%A Sham Kakade
%A Aaron Sidford
%B Proceedings of the 32nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2015
%E Francis Bach
%E David Blei
%F pmlr-v37-frostig15
%I PMLR
%P 2540--2548
%U https://proceedings.mlr.press/v37/frostig15.html
%V 37
%X We develop a family of accelerated stochastic algorithms that optimize sums of convex functions. Our algorithms improve upon the fastest running time for empirical risk minimization (ERM), and in particular linear least-squares regression, across a wide range of problem settings. To achieve this, we establish a framework, based on the classical proximal point algorithm, useful for accelerating recent fast stochastic algorithms in a black-box fashion. Empirically, we demonstrate that the resulting algorithms exhibit notions of stability that are advantageous in practice. Both in theory and in practice, the provided algorithms reap the computational benefits of adding a large strongly convex regularization term, without incurring a corresponding bias to the original ERM problem.
RIS
TY - CPAPER
TI - Un-regularizing: approximate proximal point and faster stochastic algorithms for empirical risk minimization
AU - Roy Frostig
AU - Rong Ge
AU - Sham Kakade
AU - Aaron Sidford
BT - Proceedings of the 32nd International Conference on Machine Learning
DA - 2015/06/01
ED - Francis Bach
ED - David Blei
ID - pmlr-v37-frostig15
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 37
SP - 2540
EP - 2548
L1 - http://proceedings.mlr.press/v37/frostig15.pdf
UR - https://proceedings.mlr.press/v37/frostig15.html
AB - We develop a family of accelerated stochastic algorithms that optimize sums of convex functions. Our algorithms improve upon the fastest running time for empirical risk minimization (ERM), and in particular linear least-squares regression, across a wide range of problem settings. To achieve this, we establish a framework, based on the classical proximal point algorithm, useful for accelerating recent fast stochastic algorithms in a black-box fashion. Empirically, we demonstrate that the resulting algorithms exhibit notions of stability that are advantageous in practice. Both in theory and in practice, the provided algorithms reap the computational benefits of adding a large strongly convex regularization term, without incurring a corresponding bias to the original ERM problem.
ER -
APA
Frostig, R., Ge, R., Kakade, S. & Sidford, A. (2015). Un-regularizing: approximate proximal point and faster stochastic algorithms for empirical risk minimization. Proceedings of the 32nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 37:2540-2548. Available from https://proceedings.mlr.press/v37/frostig15.html.