Basis Matters: Better Communication-Efficient Second Order Methods for Federated Learning

Xun Qian, Rustem Islamov, Mher Safaryan, Peter Richtarik
Proceedings of The 25th International Conference on Artificial Intelligence and Statistics, PMLR 151:680-720, 2022.

Abstract

Recent advances in distributed optimization have shown that Newton-type methods with proper communication compression mechanisms can guarantee fast local rates and low communication cost compared to first order methods. We discover that the communication cost of these methods can be further reduced, sometimes dramatically so, with a surprisingly simple trick: Basis Learn (BL). The idea is to transform the usual representation of the local Hessians via a change of basis in the space of matrices and apply compression tools to the new representation. To demonstrate the potential of using custom bases, we design a new Newton-type method (BL1), which reduces communication cost via both BL technique and bidirectional compression mechanism. Furthermore, we present two alternative extensions (BL2 and BL3) to partial participation to accommodate federated learning applications. We prove local linear and superlinear rates independent of the condition number. Finally, we support our claims with numerical experiments by comparing several first and second order methods.
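The abstract only sketches the Basis Learn idea, so the following minimal Python sketch illustrates the generic recipe it describes: express a local Hessian in a chosen matrix basis, compress the resulting coefficient vector, and let the receiver reconstruct the matrix from the compressed coefficients. The orthonormal basis, the Top-K compressor, and all function names here are illustrative assumptions for exposition, not the paper's actual BL1/BL2/BL3 algorithms or their bases.

import numpy as np

def top_k(v, k):
    # Top-K sparsifier: keep the k largest-magnitude entries, zero out the rest.
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-k:]
    out[idx] = v[idx]
    return out

def compress_hessian_in_basis(H, basis, k):
    # Change of basis: `basis` is assumed to be a list of d-by-d matrices that are
    # orthonormal under the trace inner product <A, B> = trace(A^T B), so the
    # coefficients are obtained by projecting H onto each basis matrix.
    coeffs = np.array([np.trace(B.T @ H) for B in basis])
    # Compression is applied to the coefficient vector, not to H itself.
    coeffs_hat = top_k(coeffs, k)
    # Receiver-side reconstruction of the (compressed) Hessian estimate.
    H_hat = sum(c * B for c, B in zip(coeffs_hat, basis))
    return coeffs_hat, H_hat

The point of the sketch is that only the compressed coefficient vector needs to be communicated; when the Hessians of interest lie in the span of a few basis matrices, this representation can be far cheaper than compressing the d-by-d matrix entrywise.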

Cite this Paper


BibTeX
@InProceedings{pmlr-v151-qian22a,
  title     = {Basis Matters: Better Communication-Efficient Second Order Methods for Federated Learning},
  author    = {Qian, Xun and Islamov, Rustem and Safaryan, Mher and Richtarik, Peter},
  booktitle = {Proceedings of The 25th International Conference on Artificial Intelligence and Statistics},
  pages     = {680--720},
  year      = {2022},
  editor    = {Camps-Valls, Gustau and Ruiz, Francisco J. R. and Valera, Isabel},
  volume    = {151},
  series    = {Proceedings of Machine Learning Research},
  month     = {28--30 Mar},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v151/qian22a/qian22a.pdf},
  url       = {https://proceedings.mlr.press/v151/qian22a.html}
}
Endnote
%0 Conference Paper
%T Basis Matters: Better Communication-Efficient Second Order Methods for Federated Learning
%A Xun Qian
%A Rustem Islamov
%A Mher Safaryan
%A Peter Richtarik
%B Proceedings of The 25th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2022
%E Gustau Camps-Valls
%E Francisco J. R. Ruiz
%E Isabel Valera
%F pmlr-v151-qian22a
%I PMLR
%P 680--720
%U https://proceedings.mlr.press/v151/qian22a.html
%V 151
APA
Qian, X., Islamov, R., Safaryan, M. & Richtarik, P. (2022). Basis Matters: Better Communication-Efficient Second Order Methods for Federated Learning. Proceedings of The 25th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 151:680-720. Available from https://proceedings.mlr.press/v151/qian22a.html.
