MURANA: A Generic Framework for Stochastic Variance-Reduced Optimization

Laurent Condat, Peter Richtarik
Proceedings of Mathematical and Scientific Machine Learning, PMLR 190:81-96, 2022.

Abstract

We propose a generic variance-reduced algorithm, which we call MUltiple RANdomized Algorithm (MURANA), for minimizing a sum of several smooth functions plus a regularizer, in a sequential or distributed manner. Our method is formulated with general stochastic operators, which allow us to model various strategies for reducing the computational complexity. For example, MURANA supports sparse activation of the gradients, and also reduction of the communication load via compression of the update vectors. This versatility allows MURANA to cover many existing randomization mechanisms within a unified framework, which also makes it possible to design new methods as special cases.
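To make the problem class concrete: in generic notation (a sketch based on the abstract's description; the symbols n, f_i, R, and d below are standard names for the components mentioned above, not notation quoted from the paper), MURANA addresses regularized finite-sum objectives of the form

    \min_{x \in \mathbb{R}^d} \; \sum_{i=1}^{n} f_i(x) + R(x),

where each f_i is smooth and R is a possibly nonsmooth regularizer. In the distributed reading of this template, each f_i is held by one node, and the general stochastic operators mentioned above can model, for instance, activating only a random subset of the gradients \nabla f_i at each iteration, or compressing the update vectors before they are communicated.

As a self-contained illustration of the latter mechanism, the following Python snippet implements a classical unbiased rand-k sparsifier, one of the standard compression operators that a framework of this kind is designed to accommodate; it is given purely as an example and is not claimed to be the specific operator used in the paper.

    # Illustrative only: a rand-k compression operator that keeps k random
    # coordinates of a vector and rescales them so the result is unbiased.
    import numpy as np

    def rand_k(v, k, rng):
        """Return an unbiased sparse surrogate of v: E[rand_k(v)] = v."""
        d = v.size
        out = np.zeros_like(v)
        idx = rng.choice(d, size=k, replace=False)  # k coordinates kept
        out[idx] = v[idx] * (d / k)                 # rescale for unbiasedness
        return out

    rng = np.random.default_rng(0)
    g = np.array([1.0, -2.0, 0.5, 3.0])
    print(rand_k(g, k=2, rng=rng))  # only 2 nonzero entries are transmitted

Transmitting only the k selected coordinates (and their indices) reduces the communication load per update, at the cost of extra variance, which is precisely the trade-off that variance-reduced schemes of this kind aim to control.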

Cite this Paper

BibTeX
@InProceedings{pmlr-v190-condat22a,
  title     = {MURANA: A Generic Framework for Stochastic Variance-Reduced Optimization},
  author    = {Condat, Laurent and Richtarik, Peter},
  booktitle = {Proceedings of Mathematical and Scientific Machine Learning},
  pages     = {81--96},
  year      = {2022},
  editor    = {Dong, Bin and Li, Qianxiao and Wang, Lei and Xu, Zhi-Qin John},
  volume    = {190},
  series    = {Proceedings of Machine Learning Research},
  month     = {15--17 Aug},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v190/condat22a/condat22a.pdf},
  url       = {https://proceedings.mlr.press/v190/condat22a.html},
  abstract  = {We propose a generic variance-reduced algorithm, which we call MUltiple RANdomized Algorithm (MURANA), for minimizing a sum of several smooth functions plus a regularizer, in a sequential or distributed manner. Our method is formulated with general stochastic operators, which allow us to model various strategies for reducing the computational complexity. For example, MURANA supports sparse activation of the gradients, and also reduction of the communication load via compression of the update vectors. This versatility allows MURANA to cover many existing randomization mechanisms within a unified framework, which also makes it possible to design new methods as special cases.}
}
Endnote
%0 Conference Paper
%T MURANA: A Generic Framework for Stochastic Variance-Reduced Optimization
%A Laurent Condat
%A Peter Richtarik
%B Proceedings of Mathematical and Scientific Machine Learning
%C Proceedings of Machine Learning Research
%D 2022
%E Bin Dong
%E Qianxiao Li
%E Lei Wang
%E Zhi-Qin John Xu
%F pmlr-v190-condat22a
%I PMLR
%P 81--96
%U https://proceedings.mlr.press/v190/condat22a.html
%V 190
%X We propose a generic variance-reduced algorithm, which we call MUltiple RANdomized Algorithm (MURANA), for minimizing a sum of several smooth functions plus a regularizer, in a sequential or distributed manner. Our method is formulated with general stochastic operators, which allow us to model various strategies for reducing the computational complexity. For example, MURANA supports sparse activation of the gradients, and also reduction of the communication load via compression of the update vectors. This versatility allows MURANA to cover many existing randomization mechanisms within a unified framework, which also makes it possible to design new methods as special cases.
APA
Condat, L. & Richtarik, P. (2022). MURANA: A Generic Framework for Stochastic Variance-Reduced Optimization. Proceedings of Mathematical and Scientific Machine Learning, in Proceedings of Machine Learning Research 190:81-96. Available from https://proceedings.mlr.press/v190/condat22a.html.
