Correlated Quantization for Distributed Mean Estimation and Optimization

Ananda Theertha Suresh, Ziteng Sun, Jae Ro, Felix Yu
Proceedings of the 39th International Conference on Machine Learning, PMLR 162:20856-20876, 2022.

Abstract

We study the problem of distributed mean estimation and optimization under communication constraints. We propose a correlated quantization protocol whose error guarantee depends on the deviation of the data points rather than on their absolute range. The design does not require any prior knowledge of the concentration properties of the dataset, which previous works require in order to obtain such a dependence. We show that applying the proposed protocol as a subroutine in distributed optimization algorithms leads to better convergence rates. We also prove the optimality of our protocol under mild assumptions. Experimental results show that our proposed algorithm outperforms existing mean estimation protocols on a diverse set of tasks.
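The abstract describes the protocol only at a high level. The sketch below is a minimal, hypothetical 1-D illustration of the core idea that shared randomness can anti-correlate per-client rounding errors so they cancel in the mean; the stratified-threshold construction, function names, and parameters here are illustrative assumptions, not the paper's exact protocol.

```python
import numpy as np

def independent_quantize(x, rng):
    # Baseline: each client rounds x_i in [0, 1] up with probability x_i,
    # using independent randomness (unbiased, but errors add up).
    return (rng.random(len(x)) < x).astype(float)

def correlated_quantize(x, rng):
    # Illustrative stand-in: clients share a random permutation and a common
    # uniform shift, giving evenly spread (stratified) rounding thresholds
    # u_i = (perm[i] + u) / n, so the rounding errors tend to cancel.
    # When all inputs are identical, this guarantees |estimate - mean| <= 1/n.
    n = len(x)
    u = (rng.permutation(n) + rng.random()) / n
    return (u < x).astype(float)

rng = np.random.default_rng(0)
x = np.full(1000, 0.4143)  # concentrated data: every client holds ~0.4143
est_corr = correlated_quantize(x, rng).mean()
est_ind = independent_quantize(x, rng).mean()
```

With identical (maximally concentrated) inputs, the correlated scheme's mean estimate is deterministically within 1/n of the true mean, whereas the independent baseline's error scales with the clients' value range; this mirrors the abstract's claim that the error depends on the deviation of the data rather than its absolute range.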

Cite this Paper


BibTeX
@InProceedings{pmlr-v162-suresh22a,
  title     = {Correlated Quantization for Distributed Mean Estimation and Optimization},
  author    = {Suresh, Ananda Theertha and Sun, Ziteng and Ro, Jae and Yu, Felix},
  booktitle = {Proceedings of the 39th International Conference on Machine Learning},
  pages     = {20856--20876},
  year      = {2022},
  editor    = {Chaudhuri, Kamalika and Jegelka, Stefanie and Song, Le and Szepesvari, Csaba and Niu, Gang and Sabato, Sivan},
  volume    = {162},
  series    = {Proceedings of Machine Learning Research},
  month     = {17--23 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v162/suresh22a/suresh22a.pdf},
  url       = {https://proceedings.mlr.press/v162/suresh22a.html},
  abstract  = {We study the problem of distributed mean estimation and optimization under communication constraints. We propose a correlated quantization protocol whose error guarantee depends on the deviation of data points instead of their absolute range. The design doesn’t need any prior knowledge on the concentration property of the dataset, which is required to get such dependence in previous works. We show that applying the proposed protocol as a sub-routine in distributed optimization algorithms leads to better convergence rates. We also prove the optimality of our protocol under mild assumptions. Experimental results show that our proposed algorithm outperforms existing mean estimation protocols on a diverse set of tasks.}
}
Endnote
%0 Conference Paper
%T Correlated Quantization for Distributed Mean Estimation and Optimization
%A Ananda Theertha Suresh
%A Ziteng Sun
%A Jae Ro
%A Felix Yu
%B Proceedings of the 39th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2022
%E Kamalika Chaudhuri
%E Stefanie Jegelka
%E Le Song
%E Csaba Szepesvari
%E Gang Niu
%E Sivan Sabato
%F pmlr-v162-suresh22a
%I PMLR
%P 20856--20876
%U https://proceedings.mlr.press/v162/suresh22a.html
%V 162
%X We study the problem of distributed mean estimation and optimization under communication constraints. We propose a correlated quantization protocol whose error guarantee depends on the deviation of data points instead of their absolute range. The design doesn’t need any prior knowledge on the concentration property of the dataset, which is required to get such dependence in previous works. We show that applying the proposed protocol as a sub-routine in distributed optimization algorithms leads to better convergence rates. We also prove the optimality of our protocol under mild assumptions. Experimental results show that our proposed algorithm outperforms existing mean estimation protocols on a diverse set of tasks.
APA
Suresh, A. T., Sun, Z., Ro, J. &amp; Yu, F. (2022). Correlated Quantization for Distributed Mean Estimation and Optimization. Proceedings of the 39th International Conference on Machine Learning, in Proceedings of Machine Learning Research 162:20856-20876. Available from https://proceedings.mlr.press/v162/suresh22a.html.
