Stochastic Composite Least-Squares Regression with Convergence Rate $O(1/n)$

Nicolas Flammarion, Francis Bach
Proceedings of the 2017 Conference on Learning Theory, PMLR 65:831-875, 2017.

Abstract

We consider the minimization of composite objective functions, i.e., the sum of the expectation of quadratic functions and an arbitrary convex function. We study the stochastic dual averaging algorithm with a constant step-size, showing that it leads to a convergence rate of O(1/n) without strong convexity assumptions. This extends earlier results on least-squares regression with the Euclidean geometry to (a) all convex regularizers and constraints, and (b) all geometries represented by a Bregman divergence. This is achieved by a new proof technique that relates stochastic and deterministic recursions.
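For readers who want a concrete instance of the algorithm being analyzed, the following is a minimal sketch, not taken from the paper, of stochastic dual averaging with a constant step-size in the special case of the Euclidean geometry with an ℓ1 regularizer, where the inner argmin reduces to soft-thresholding. The function names, data layout, and the choice of returning the averaged iterate are illustrative assumptions.

import numpy as np

def soft_threshold(v, tau):
    # Componentwise soft-thresholding: the proximal operator of tau * ||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def dual_averaging_l1(X, y, gamma, lam, theta0=None):
    # Hypothetical sketch: constant step-size stochastic dual averaging for
    #   F(theta) = (1/2) E[(y - <x, theta>)^2] + lam * ||theta||_1,
    # with the Euclidean geometry Phi(theta) = ||theta||^2 / 2.
    n, d = X.shape
    z = np.zeros(d) if theta0 is None else theta0.copy()  # dual variable: starts at grad Phi(theta0)
    theta = z.copy()
    theta_bar = np.zeros(d)
    for i in range(n):
        x_i, y_i = X[i], y[i]
        grad = (x_i @ theta - y_i) * x_i        # stochastic gradient of the quadratic part
        z -= gamma * grad                       # accumulate gradients in the dual space
        # argmin_theta { -<z, theta> + gamma*(i+1)*lam*||theta||_1 + ||theta||^2/2 }
        theta = soft_threshold(z, gamma * (i + 1) * lam)
        theta_bar += (theta - theta_bar) / (i + 1)  # running Polyak-Ruppert average
    return theta_bar

A constant step-size on the order of 1 / E||x||^2 is the natural scaling in this setting; the precise constant appearing in the paper's guarantees is given in the full text.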

Cite this Paper


BibTeX
@InProceedings{pmlr-v65-flammarion17a,
  title     = {Stochastic Composite Least-Squares Regression with Convergence Rate $O(1/n)$},
  author    = {Flammarion, Nicolas and Bach, Francis},
  booktitle = {Proceedings of the 2017 Conference on Learning Theory},
  pages     = {831--875},
  year      = {2017},
  editor    = {Kale, Satyen and Shamir, Ohad},
  volume    = {65},
  series    = {Proceedings of Machine Learning Research},
  month     = {07--10 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v65/flammarion17a/flammarion17a.pdf},
  url       = {https://proceedings.mlr.press/v65/flammarion17a.html},
  abstract  = {We consider the minimization of composite objective functions, i.e., the sum of the expectation of quadratic functions and an arbitrary convex function. We study the stochastic dual averaging algorithm with a constant step-size, showing that it leads to a convergence rate of O(1/n) without strong convexity assumptions. This extends earlier results on least-squares regression with the Euclidean geometry to (a) all convex regularizers and constraints, and (b) all geometries represented by a Bregman divergence. This is achieved by a new proof technique that relates stochastic and deterministic recursions.}
}
Endnote
%0 Conference Paper
%T Stochastic Composite Least-Squares Regression with Convergence Rate $O(1/n)$
%A Nicolas Flammarion
%A Francis Bach
%B Proceedings of the 2017 Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2017
%E Satyen Kale
%E Ohad Shamir
%F pmlr-v65-flammarion17a
%I PMLR
%P 831--875
%U https://proceedings.mlr.press/v65/flammarion17a.html
%V 65
%X We consider the minimization of composite objective functions, i.e., the sum of the expectation of quadratic functions and an arbitrary convex function. We study the stochastic dual averaging algorithm with a constant step-size, showing that it leads to a convergence rate of O(1/n) without strong convexity assumptions. This extends earlier results on least-squares regression with the Euclidean geometry to (a) all convex regularizers and constraints, and (b) all geometries represented by a Bregman divergence. This is achieved by a new proof technique that relates stochastic and deterministic recursions.
APA
Flammarion, N., & Bach, F. (2017). Stochastic Composite Least-Squares Regression with Convergence Rate $O(1/n)$. Proceedings of the 2017 Conference on Learning Theory, in Proceedings of Machine Learning Research 65:831-875. Available from https://proceedings.mlr.press/v65/flammarion17a.html.
