Iterated Vector Fields and Conservatism, with Applications to Federated Learning

Zachary Charles, Keith Rush
Proceedings of The 33rd International Conference on Algorithmic Learning Theory, PMLR 167:130-147, 2022.

Abstract

We study whether iterated vector fields (vector fields composed with themselves) are conservative. We give explicit examples of vector fields for which this self-composition preserves conservatism. Notably, this includes gradient vector fields of loss functions associated to some generalized linear models. In the context of federated learning, we show that when clients have loss functions whose gradient satisfies this condition, federated averaging is equivalent to gradient descent on a surrogate loss function. We leverage this to derive novel convergence results for federated learning. By contrast, we demonstrate that when the client losses violate this property, federated averaging can yield behavior which is fundamentally distinct from centralized optimization. Finally, we discuss theoretical and practical questions our analytical framework raises for federated learning.
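
As a brief orientation for the abstract's terminology (our own notation and example, not taken from this page): a smooth vector field F : R^d -> R^d is conservative if F = ∇f for some scalar potential f, and the iterated vector field is the self-composition

    F^{(k)}(w) = (F ∘ F ∘ ⋯ ∘ F)(w)   (k compositions),

so the central question is when F^{(k)} is again a gradient field. A quadratic potential gives a hedged sanity check: if f(w) = (1/2) wᵀ A w with A symmetric, then F(w) = A w and F^{(k)}(w) = A^k w = ∇( (1/2) wᵀ A^k w ), so conservatism survives iteration. This is only an illustrative special case in the spirit of the generalized linear model losses mentioned above, not a result quoted from the paper itself.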

Cite this Paper


BibTeX
@InProceedings{pmlr-v167-charles22a,
  title     = {Iterated Vector Fields and Conservatism, with Applications to Federated Learning},
  author    = {Charles, Zachary and Rush, Keith},
  booktitle = {Proceedings of The 33rd International Conference on Algorithmic Learning Theory},
  pages     = {130--147},
  year      = {2022},
  editor    = {Dasgupta, Sanjoy and Haghtalab, Nika},
  volume    = {167},
  series    = {Proceedings of Machine Learning Research},
  month     = {29 Mar--01 Apr},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v167/charles22a/charles22a.pdf},
  url       = {https://proceedings.mlr.press/v167/charles22a.html},
  abstract  = {We study whether iterated vector fields (vector fields composed with themselves) are conservative. We give explicit examples of vector fields for which this self-composition preserves conservatism. Notably, this includes gradient vector fields of loss functions associated to some generalized linear models. In the context of federated learning, we show that when clients have loss functions whose gradient satisfies this condition, federated averaging is equivalent to gradient descent on a surrogate loss function. We leverage this to derive novel convergence results for federated learning. By contrast, we demonstrate that when the client losses violate this property, federated averaging can yield behavior which is fundamentally distinct from centralized optimization. Finally, we discuss theoretical and practical questions our analytical framework raises for federated learning.}
}
Endnote
%0 Conference Paper
%T Iterated Vector Fields and Conservatism, with Applications to Federated Learning
%A Zachary Charles
%A Keith Rush
%B Proceedings of The 33rd International Conference on Algorithmic Learning Theory
%C Proceedings of Machine Learning Research
%D 2022
%E Sanjoy Dasgupta
%E Nika Haghtalab
%F pmlr-v167-charles22a
%I PMLR
%P 130--147
%U https://proceedings.mlr.press/v167/charles22a.html
%V 167
%X We study whether iterated vector fields (vector fields composed with themselves) are conservative. We give explicit examples of vector fields for which this self-composition preserves conservatism. Notably, this includes gradient vector fields of loss functions associated to some generalized linear models. In the context of federated learning, we show that when clients have loss functions whose gradient satisfies this condition, federated averaging is equivalent to gradient descent on a surrogate loss function. We leverage this to derive novel convergence results for federated learning. By contrast, we demonstrate that when the client losses violate this property, federated averaging can yield behavior which is fundamentally distinct from centralized optimization. Finally, we discuss theoretical and practical questions our analytical framework raises for federated learning.
APA
Charles, Z. & Rush, K. (2022). Iterated Vector Fields and Conservatism, with Applications to Federated Learning. Proceedings of The 33rd International Conference on Algorithmic Learning Theory, in Proceedings of Machine Learning Research 167:130-147. Available from https://proceedings.mlr.press/v167/charles22a.html.