A principled framework for the design and analysis of token algorithms

Hadrien Hendrikx
Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, PMLR 206:470-489, 2023.

Abstract

We consider a decentralized optimization problem, in which n nodes collaborate to optimize a global objective function using local communications only. While many decentralized algorithms focus on gossip communications (pairwise averaging), we consider a different scheme, in which a “token” that contains the current estimate of the model performs a random walk over the network, and updates its model using the local model of the node it is at. Indeed, token algorithms generally benefit from improved communication efficiency and privacy guarantees. We frame the token algorithm as a randomized gossip algorithm on a conceptual graph, which allows us to prove a series of convergence results for variance-reduced and accelerated token algorithms for the complete graph. We also extend these results to the case of multiple tokens by extending the conceptual graph, and to general graphs by tweaking the communication procedure. The reduction from token to well-studied gossip algorithms leads to tight rates for many token algorithms, and we illustrate their performance empirically.
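The token scheme described in the abstract can be illustrated with a toy averaging simulation. This is a sketch under assumptions, not the paper's actual algorithm: the function name `token_averaging` and the specific pairwise-averaging update are illustrative choices; the paper treats general optimization objectives, variance reduction, and acceleration, none of which appear here.

```python
import random

def token_averaging(local_values, steps, seed=0):
    """Toy simulation of a single-token algorithm on the complete graph.

    A token holding the current estimate performs a uniform random walk;
    at each node it visits, it averages its estimate with that node's
    local one (a gossip-style update, assumed here for illustration).
    """
    rng = random.Random(seed)
    x = list(local_values)   # local estimates held by the n nodes
    token = x[0]             # token starts with a copy of node 0's estimate
    for _ in range(steps):
        i = rng.randrange(len(x))             # uniform step of the random walk
        token = x[i] = 0.5 * (token + x[i])   # pairwise averaging update
    return token, x
```

Because each update preserves the sum of the n+1 estimates (token plus nodes), they all converge to their common average, which mirrors the paper's conceptual-graph view of the token as an extra node in a randomized gossip scheme.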

Cite this Paper


BibTeX
@InProceedings{pmlr-v206-hendrikx23a,
  title     = {A principled framework for the design and analysis of token algorithms},
  author    = {Hendrikx, Hadrien},
  booktitle = {Proceedings of The 26th International Conference on Artificial Intelligence and Statistics},
  pages     = {470--489},
  year      = {2023},
  editor    = {Ruiz, Francisco and Dy, Jennifer and van de Meent, Jan-Willem},
  volume    = {206},
  series    = {Proceedings of Machine Learning Research},
  month     = {25--27 Apr},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v206/hendrikx23a/hendrikx23a.pdf},
  url       = {https://proceedings.mlr.press/v206/hendrikx23a.html},
  abstract  = {We consider a decentralized optimization problem, in which n nodes collaborate to optimize a global objective function using local communications only. While many decentralized algorithms focus on gossip communications (pairwise averaging), we consider a different scheme, in which a “token” that contains the current estimate of the model performs a random walk over the network, and updates its model using the local model of the node it is at. Indeed, token algorithms generally benefit from improved communication efficiency and privacy guarantees. We frame the token algorithm as a randomized gossip algorithm on a conceptual graph, which allows us to prove a series of convergence results for variance-reduced and accelerated token algorithms for the complete graph. We also extend these results to the case of multiple tokens by extending the conceptual graph, and to general graphs by tweaking the communication procedure. The reduction from token to well-studied gossip algorithms leads to tight rates for many token algorithms, and we illustrate their performance empirically.}
}
Endnote
%0 Conference Paper
%T A principled framework for the design and analysis of token algorithms
%A Hadrien Hendrikx
%B Proceedings of The 26th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2023
%E Francisco Ruiz
%E Jennifer Dy
%E Jan-Willem van de Meent
%F pmlr-v206-hendrikx23a
%I PMLR
%P 470--489
%U https://proceedings.mlr.press/v206/hendrikx23a.html
%V 206
%X We consider a decentralized optimization problem, in which n nodes collaborate to optimize a global objective function using local communications only. While many decentralized algorithms focus on gossip communications (pairwise averaging), we consider a different scheme, in which a “token” that contains the current estimate of the model performs a random walk over the network, and updates its model using the local model of the node it is at. Indeed, token algorithms generally benefit from improved communication efficiency and privacy guarantees. We frame the token algorithm as a randomized gossip algorithm on a conceptual graph, which allows us to prove a series of convergence results for variance-reduced and accelerated token algorithms for the complete graph. We also extend these results to the case of multiple tokens by extending the conceptual graph, and to general graphs by tweaking the communication procedure. The reduction from token to well-studied gossip algorithms leads to tight rates for many token algorithms, and we illustrate their performance empirically.
APA
Hendrikx, H. (2023). A principled framework for the design and analysis of token algorithms. Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 206:470-489. Available from https://proceedings.mlr.press/v206/hendrikx23a.html.