Federated Linear Contextual Bandits with Heterogeneous Clients

Ethan Blaser, Chuanhao Li, Hongning Wang
Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, PMLR 238:631-639, 2024.

Abstract

The demand for collaborative and private bandit learning across multiple agents is surging due to the growing quantity of data generated by distributed systems. Federated bandit learning has emerged as a promising framework for private, efficient, and decentralized online learning. However, almost all previous works rely on strong assumptions of client homogeneity, i.e., all participating clients must share the same bandit model; otherwise, they would all suffer linear regret. This greatly restricts the application of federated bandit learning in practice. In this work, we introduce a new approach to federated bandits with heterogeneous clients, which clusters clients for collaborative bandit learning under the federated learning setting. Our proposed algorithm achieves non-trivial sub-linear regret and communication cost for all clients, subject to the federated learning communication protocol, under which the server can share only one model at any time.
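To make the setting concrete, the following is a minimal, purely illustrative Python sketch of a clustered, LinUCB-style federated loop. It is not the algorithm proposed in the paper: the cluster structure is fixed in advance rather than learned online, and the single-model constraint is approximated by alternating the server's broadcast between clusters; the parameter names, exploration constant, and round-robin schedule are assumptions made only for illustration.

import numpy as np

# Purely illustrative sketch (NOT the paper's algorithm): a clustered,
# LinUCB-style federated bandit loop. Cluster assignments are fixed in
# advance and the server rotates its single broadcast model between the
# two clusters; both are simplifying assumptions for illustration.

d, n_rounds, n_arms = 5, 200, 10
rng = np.random.default_rng(0)

# Two latent clusters of clients, each with its own true bandit parameter.
true_thetas = [rng.normal(size=d), rng.normal(size=d)]
client_cluster = [0, 0, 1, 1]    # four clients; ground truth assumed known here

# Per-cluster sufficient statistics aggregated at the server:
# A = I + sum of x x^T,  b = sum of reward * x  (standard ridge/LinUCB statistics).
A = [np.eye(d) for _ in range(2)]
b = [np.zeros(d) for _ in range(2)]
alpha = 1.0                      # exploration weight, an assumed constant

for t in range(n_rounds):
    # Stand-in for the single-model constraint: the server broadcasts only
    # one cluster's model per round, alternating between clusters.
    c_star = t % 2
    theta_hat = np.linalg.solve(A[c_star], b[c_star])
    A_inv = np.linalg.inv(A[c_star])

    for c in client_cluster:
        if c != c_star:
            continue             # clients outside the broadcast cluster wait
        arms = rng.normal(size=(n_arms, d))
        # LinUCB score: estimated reward plus confidence bonus.
        bonus = np.sqrt(np.einsum("ai,ij,aj->a", arms, A_inv, arms))
        x = arms[np.argmax(arms @ theta_hat + alpha * bonus)]
        reward = x @ true_thetas[c] + 0.1 * rng.normal()
        # Client uploads its local update; the server folds it into the cluster model.
        A[c] += np.outer(x, x)
        b[c] += reward * x

for c in range(2):
    err = np.linalg.norm(np.linalg.solve(A[c], b[c]) - true_thetas[c])
    print(f"cluster {c}: parameter estimation error {err:.3f}")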

Cite this Paper


BibTeX
@InProceedings{pmlr-v238-blaser24a,
  title     = {Federated Linear Contextual Bandits with Heterogeneous Clients},
  author    = {Blaser, Ethan and Li, Chuanhao and Wang, Hongning},
  booktitle = {Proceedings of The 27th International Conference on Artificial Intelligence and Statistics},
  pages     = {631--639},
  year      = {2024},
  editor    = {Dasgupta, Sanjoy and Mandt, Stephan and Li, Yingzhen},
  volume    = {238},
  series    = {Proceedings of Machine Learning Research},
  month     = {02--04 May},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v238/blaser24a/blaser24a.pdf},
  url       = {https://proceedings.mlr.press/v238/blaser24a.html}
}
Endnote
%0 Conference Paper
%T Federated Linear Contextual Bandits with Heterogeneous Clients
%A Ethan Blaser
%A Chuanhao Li
%A Hongning Wang
%B Proceedings of The 27th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2024
%E Sanjoy Dasgupta
%E Stephan Mandt
%E Yingzhen Li
%F pmlr-v238-blaser24a
%I PMLR
%P 631--639
%U https://proceedings.mlr.press/v238/blaser24a.html
%V 238
APA
Blaser, E., Li, C. & Wang, H. (2024). Federated Linear Contextual Bandits with Heterogeneous Clients. Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 238:631-639. Available from https://proceedings.mlr.press/v238/blaser24a.html.