Mutual Transfer Learning for Massive Data

Ching-Wei Cheng, Xingye Qiao, Guang Cheng
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:1800-1809, 2020.

Abstract

In the transfer learning problem, the target and the source data domains are typically known. In this article, we study a new paradigm called mutual transfer learning, where among many heterogeneous data domains, every data domain could potentially be the target of interest, and it could also be a useful source to help the learning in other data domains. However, given a target, not every data domain can be a successful source; only data sets that are similar enough to be thought of as coming from the same population can be useful sources for each other. Under this mutual learnability assumption, a confidence distribution fusion approach is proposed to recover the mutual learnability relation in the transfer learning regime. Our proposed method achieves the same oracle statistical inferential accuracy as if the true learnability structure were known. It can be implemented in an efficient parallel fashion to deal with large-scale data. Simulated and real examples are analyzed to illustrate the usefulness of the proposed method.
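The abstract mentions fusing confidence distributions across mutually learnable domains. As a rough illustration only (not the paper's actual procedure), the sketch below shows the standard way independent normal confidence distributions are combined: inverse-variance (precision) weighting of per-domain estimates. The function name and the toy numbers are invented for this example.

```python
import math

def fuse_normal_cds(estimates, std_errors):
    """Combine independent normal confidence distributions for a common
    parameter by inverse-variance (precision) weighting.

    Returns the fused point estimate and its standard error; the fused
    standard error is always smaller than any individual one, which is
    the payoff of borrowing strength from similar domains."""
    weights = [1.0 / se ** 2 for se in std_errors]
    total = sum(weights)
    fused = sum(w * est for w, est in zip(weights, estimates)) / total
    return fused, math.sqrt(1.0 / total)

# Toy example: three domains whose estimates are mutually consistent,
# so fusing them is justified under the mutual learnability assumption.
ests = [1.02, 0.98, 1.05]
ses = [0.10, 0.12, 0.09]
theta, se = fuse_normal_cds(ests, ses)
print(theta, se)
```

In practice one would first test which domains are consistent enough to fuse (the learnability structure the paper recovers); the weighting step above only applies within such a group, and each per-domain estimate can be computed in parallel before the cheap fusion step.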

Cite this Paper


BibTeX
@InProceedings{pmlr-v119-cheng20d,
  title = {Mutual Transfer Learning for Massive Data},
  author = {Cheng, Ching-Wei and Qiao, Xingye and Cheng, Guang},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages = {1800--1809},
  year = {2020},
  editor = {III, Hal Daumé and Singh, Aarti},
  volume = {119},
  series = {Proceedings of Machine Learning Research},
  month = {13--18 Jul},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v119/cheng20d/cheng20d.pdf},
  url = {https://proceedings.mlr.press/v119/cheng20d.html},
  abstract = {In the transfer learning problem, the target and the source data domains are typically known. In this article, we study a new paradigm called mutual transfer learning, where among many heterogeneous data domains, every data domain could potentially be the target of interest, and it could also be a useful source to help the learning in other data domains. However, given a target, not every data domain can be a successful source; only data sets that are similar enough to be thought of as coming from the same population can be useful sources for each other. Under this mutual learnability assumption, a confidence distribution fusion approach is proposed to recover the mutual learnability relation in the transfer learning regime. Our proposed method achieves the same oracle statistical inferential accuracy as if the true learnability structure were known. It can be implemented in an efficient parallel fashion to deal with large-scale data. Simulated and real examples are analyzed to illustrate the usefulness of the proposed method.}
}
Endnote
%0 Conference Paper
%T Mutual Transfer Learning for Massive Data
%A Ching-Wei Cheng
%A Xingye Qiao
%A Guang Cheng
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-cheng20d
%I PMLR
%P 1800--1809
%U https://proceedings.mlr.press/v119/cheng20d.html
%V 119
%X In the transfer learning problem, the target and the source data domains are typically known. In this article, we study a new paradigm called mutual transfer learning, where among many heterogeneous data domains, every data domain could potentially be the target of interest, and it could also be a useful source to help the learning in other data domains. However, given a target, not every data domain can be a successful source; only data sets that are similar enough to be thought of as coming from the same population can be useful sources for each other. Under this mutual learnability assumption, a confidence distribution fusion approach is proposed to recover the mutual learnability relation in the transfer learning regime. Our proposed method achieves the same oracle statistical inferential accuracy as if the true learnability structure were known. It can be implemented in an efficient parallel fashion to deal with large-scale data. Simulated and real examples are analyzed to illustrate the usefulness of the proposed method.
APA
Cheng, C., Qiao, X. & Cheng, G. (2020). Mutual Transfer Learning for Massive Data. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:1800-1809. Available from https://proceedings.mlr.press/v119/cheng20d.html.