Bayesian Nonparametric Federated Learning of Neural Networks

Mikhail Yurochkin, Mayank Agarwal, Soumya Ghosh, Kristjan Greenewald, Nghia Hoang, Yasaman Khazaeni
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:7252-7261, 2019.

Abstract

In federated learning problems, data is scattered across different servers and exchanging or pooling it is often impractical or prohibited. We develop a Bayesian nonparametric framework for federated learning with neural networks. Each data server is assumed to provide local neural network weights, which are modeled through our framework. We then develop an inference approach that allows us to synthesize a more expressive global network without additional supervision, data pooling and with as few as a single communication round. We then demonstrate the efficacy of our approach on federated learning problems simulated from two popular image classification datasets.

Cite this Paper


BibTeX
@InProceedings{pmlr-v97-yurochkin19a,
  title     = {{B}ayesian Nonparametric Federated Learning of Neural Networks},
  author    = {Yurochkin, Mikhail and Agarwal, Mayank and Ghosh, Soumya and Greenewald, Kristjan and Hoang, Nghia and Khazaeni, Yasaman},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning},
  pages     = {7252--7261},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Salakhutdinov, Ruslan},
  volume    = {97},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--15 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v97/yurochkin19a/yurochkin19a.pdf},
  url       = {https://proceedings.mlr.press/v97/yurochkin19a.html},
  abstract  = {In federated learning problems, data is scattered across different servers and exchanging or pooling it is often impractical or prohibited. We develop a Bayesian nonparametric framework for federated learning with neural networks. Each data server is assumed to provide local neural network weights, which are modeled through our framework. We then develop an inference approach that allows us to synthesize a more expressive global network without additional supervision, data pooling and with as few as a single communication round. We then demonstrate the efficacy of our approach on federated learning problems simulated from two popular image classification datasets.}
}
Endnote
%0 Conference Paper
%T Bayesian Nonparametric Federated Learning of Neural Networks
%A Mikhail Yurochkin
%A Mayank Agarwal
%A Soumya Ghosh
%A Kristjan Greenewald
%A Nghia Hoang
%A Yasaman Khazaeni
%B Proceedings of the 36th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Ruslan Salakhutdinov
%F pmlr-v97-yurochkin19a
%I PMLR
%P 7252--7261
%U https://proceedings.mlr.press/v97/yurochkin19a.html
%V 97
%X In federated learning problems, data is scattered across different servers and exchanging or pooling it is often impractical or prohibited. We develop a Bayesian nonparametric framework for federated learning with neural networks. Each data server is assumed to provide local neural network weights, which are modeled through our framework. We then develop an inference approach that allows us to synthesize a more expressive global network without additional supervision, data pooling and with as few as a single communication round. We then demonstrate the efficacy of our approach on federated learning problems simulated from two popular image classification datasets.
APA
Yurochkin, M., Agarwal, M., Ghosh, S., Greenewald, K., Hoang, N. & Khazaeni, Y. (2019). Bayesian Nonparametric Federated Learning of Neural Networks. Proceedings of the 36th International Conference on Machine Learning, in Proceedings of Machine Learning Research 97:7252-7261. Available from https://proceedings.mlr.press/v97/yurochkin19a.html.