On Riemannian Stochastic Approximation Schemes with Fixed Step-Size

Alain Durmus, Pablo Jiménez, Eric Moulines, Salem Said
Proceedings of The 24th International Conference on Artificial Intelligence and Statistics, PMLR 130:1018-1026, 2021.

Abstract

This paper studies fixed step-size stochastic approximation (SA) schemes, including stochastic gradient schemes, in a Riemannian framework. It is motivated by several applications, where geodesics can be computed explicitly, and their use accelerates crude Euclidean methods. A fixed step-size scheme defines a family of time-homogeneous Markov chains, parametrized by the step-size. Here, using this formulation, non-asymptotic performance bounds are derived, under Lyapunov conditions. Then, for any step-size, the corresponding Markov chain is proved to admit a unique stationary distribution, and to be geometrically ergodic. This result gives rise to a family of stationary distributions indexed by the step-size, which is further shown to converge to a Dirac measure, concentrated at the solution of the problem at hand, as the step-size goes to $0$. Finally, the asymptotic rate of this convergence is established, through an asymptotic expansion of the bias, and a central limit theorem.
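To make the setting concrete, here is a minimal sketch (not taken from the paper itself) of the kind of scheme it studies: a fixed step-size Riemannian stochastic gradient iteration for the Fréchet mean on the unit sphere, a manifold where the exponential and logarithm maps are explicit, so the geodesic update $x_{n+1} = \exp_{x_n}(\gamma \log_{x_n} Y_{n+1})$ with fixed step-size $\gamma$ can be written out directly. All names below (exp_map, log_map, sample_near, gamma) are illustrative choices, not the paper's notation.

```python
import numpy as np

def exp_map(x, v):
    # Exponential map on the unit sphere: move from x along the geodesic
    # with initial tangent velocity v for unit time.
    nv = np.linalg.norm(v)
    if nv < 1e-12:
        return x
    return np.cos(nv) * x + np.sin(nv) * (v / nv)

def log_map(x, y):
    # Riemannian logarithm: tangent vector at x pointing toward y,
    # with norm equal to the geodesic (arc) distance between x and y.
    c = np.clip(np.dot(x, y), -1.0, 1.0)
    theta = np.arccos(c)
    if theta < 1e-12:
        return np.zeros_like(x)
    u = y - c * x                       # component of y orthogonal to x
    return theta * u / np.linalg.norm(u)

def sample_near(mu, scale, rng):
    # Draw a point on the sphere by pushing Gaussian tangent noise at mu
    # through the exponential map (an illustrative noise model).
    z = scale * rng.standard_normal(mu.shape)
    v = z - np.dot(z, mu) * mu          # project onto the tangent space at mu
    return exp_map(mu, v)

rng = np.random.default_rng(0)
mu = np.array([0.0, 0.0, 1.0])          # "true" Fréchet mean
gamma = 0.05                            # fixed step-size: the iterates (x_n)
                                        # form a time-homogeneous Markov chain

x = sample_near(mu, 0.5, rng)           # initial point
for _ in range(5000):
    y = sample_near(mu, 0.3, rng)       # fresh sample Y_{n+1}
    # An unbiased Riemannian gradient of x -> E[d(x, Y)^2] / 2 is -log_x(Y);
    # one SA step moves along the geodesic toward the observed sample.
    x = exp_map(x, gamma * log_map(x, y))
    x /= np.linalg.norm(x)              # guard against floating-point drift

print("geodesic distance to mu:", np.arccos(np.clip(np.dot(x, mu), -1.0, 1.0)))
```

With a fixed $\gamma$, the iterates do not converge to the mean; they keep fluctuating in a neighborhood of it, and that neighborhood shrinks as $\gamma \to 0$. This is exactly the behavior the paper quantifies through the stationary distributions of the Markov chain, the asymptotic expansion of the bias, and the central limit theorem.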

Cite this Paper


BibTeX
@InProceedings{pmlr-v130-durmus21a,
  title     = {On Riemannian Stochastic Approximation Schemes with Fixed Step-Size},
  author    = {Durmus, Alain and Jim{\'e}nez, Pablo and Moulines, Eric and Said, Salem},
  booktitle = {Proceedings of The 24th International Conference on Artificial Intelligence and Statistics},
  pages     = {1018--1026},
  year      = {2021},
  editor    = {Banerjee, Arindam and Fukumizu, Kenji},
  volume    = {130},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--15 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v130/durmus21a/durmus21a.pdf},
  url       = {https://proceedings.mlr.press/v130/durmus21a.html}
}
APA
Durmus, A., Jiménez, P., Moulines, E. & Said, S. (2021). On Riemannian Stochastic Approximation Schemes with Fixed Step-Size. Proceedings of The 24th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 130:1018-1026. Available from https://proceedings.mlr.press/v130/durmus21a.html.

Related Material

Download PDF: http://proceedings.mlr.press/v130/durmus21a/durmus21a.pdf