Heterogeneous Model Reuse via Optimizing Multiparty Multiclass Margin

Xi-Zhu Wu, Song Liu, Zhi-Hua Zhou
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:6840-6849, 2019.

Abstract

Nowadays, many problems require learning a model from data owned by different participants who are restricted from sharing their examples due to privacy concerns, which is referred to as multiparty learning in the literature. In conventional multiparty learning, a global model is usually trained from scratch via a communication protocol, ignoring the fact that each party may already have a local model trained on her own dataset. In this paper, we define a multiparty multiclass margin to measure the global behavior of a set of heterogeneous local models, and propose a general learning method called HMR (Heterogeneous Model Reuse) to optimize the margin. Our method reuses local models to approximate a global model, even when data are non-i.i.d. distributed among parties, by exchanging a few examples under a predefined budget. Experiments on synthetic and real-world data covering different multiparty scenarios show the effectiveness of our proposal.
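To make the idea in the abstract concrete, below is a minimal sketch, not the authors' reference implementation, of combining heterogeneous local models into a global predictor and selecting the lowest-margin examples to exchange under a budget. The per-party scores(x) method, the per-class max combination, and the budget parameter are illustrative assumptions, not details taken from the paper.

# Sketch only: assumes each party's local model exposes a `scores(x)` method
# returning a dict {class_label: score} over the classes it knows about.
import numpy as np

def global_scores(local_models, x, classes):
    """Combine heterogeneous local models by taking the per-class maximum score."""
    return {c: max(m.scores(x).get(c, 0.0) for m in local_models) for c in classes}

def multiclass_margin(local_models, x, y, classes):
    """Margin of one example: true-class score minus the best wrong-class score."""
    s = global_scores(local_models, x, classes)
    other_best = max(v for c, v in s.items() if c != y)
    return s[y] - other_best

def select_examples_to_share(local_models, X, Y, classes, budget):
    """Pick the `budget` examples with the smallest (most violated) margins.

    Sharing these hardest examples with the other parties and retraining their
    local models is one plausible way to push the multiparty margin up.
    """
    margins = np.array([multiclass_margin(local_models, x, y, classes)
                        for x, y in zip(X, Y)])
    order = np.argsort(margins)  # ascending: most violated margins first
    return [int(i) for i in order[:budget]]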

Cite this Paper


BibTeX
@InProceedings{pmlr-v97-wu19c,
  title     = {Heterogeneous Model Reuse via Optimizing Multiparty Multiclass Margin},
  author    = {Wu, Xi-Zhu and Liu, Song and Zhou, Zhi-Hua},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning},
  pages     = {6840--6849},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Salakhutdinov, Ruslan},
  volume    = {97},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--15 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v97/wu19c/wu19c.pdf},
  url       = {https://proceedings.mlr.press/v97/wu19c.html}
}
Endnote
%0 Conference Paper
%T Heterogeneous Model Reuse via Optimizing Multiparty Multiclass Margin
%A Xi-Zhu Wu
%A Song Liu
%A Zhi-Hua Zhou
%B Proceedings of the 36th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Ruslan Salakhutdinov
%F pmlr-v97-wu19c
%I PMLR
%P 6840--6849
%U https://proceedings.mlr.press/v97/wu19c.html
%V 97
APA
Wu, X., Liu, S. & Zhou, Z.. (2019). Heterogeneous Model Reuse via Optimizing Multiparty Multiclass Margin. Proceedings of the 36th International Conference on Machine Learning, in Proceedings of Machine Learning Research 97:6840-6849 Available from https://proceedings.mlr.press/v97/wu19c.html.