A Difference Standardization Method for Mutual Transfer Learning

Haoqing Xu, Meng Wang, Beilun Wang
Proceedings of the 39th International Conference on Machine Learning, PMLR 162:24683-24697, 2022.

Abstract

In many real-world applications, mutual transfer learning is the paradigm in which each data domain can potentially serve as either a source or a target domain. This differs from standard transfer learning tasks, where the source and target are known a priori. However, previous studies of mutual transfer learning suffer from either high computational complexity or oversimplified hypotheses. To overcome these challenges, in this paper we propose the Difference Standardization method (DiffS) for mutual transfer learning. Specifically, we put forward a novel distance metric between domains, the standardized domain difference, to obtain fast structure recovery and accurate parameter estimation simultaneously. We validate the method's performance on both synthetic and real-world data. Compared to previous methods, DiffS runs approximately 3000 times faster while achieving equally accurate learnability structure estimation.
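The abstract does not define the standardized domain difference precisely, but the generic idea of standardizing a between-domain difference can be illustrated. The sketch below is a hypothetical construction, not the paper's actual metric: it fits an ordinary least-squares model per domain and scales the coefficient difference by its estimated standard error, coordinate-wise.

```python
import numpy as np

def standardized_domain_difference(X_a, y_a, X_b, y_b):
    """Illustrative (not the paper's) standardized difference between two
    domains: fit OLS in each domain, then scale the coefficient difference
    by the pooled per-coordinate standard error and take its norm."""
    def fit(X, y):
        # Least-squares coefficients and per-coordinate variance estimate.
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        n, p = X.shape
        sigma2 = np.sum((y - X @ beta) ** 2) / max(n - p, 1)
        var = sigma2 * np.diag(np.linalg.inv(X.T @ X))
        return beta, var

    beta_a, var_a = fit(X_a, y_a)
    beta_b, var_b = fit(X_b, y_b)
    # Standardize each coordinate of the difference, then aggregate.
    z = (beta_a - beta_b) / np.sqrt(var_a + var_b)
    return float(np.linalg.norm(z))

# Two samples drawn from the same underlying domain should yield a
# small standardized difference.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
beta = np.array([1.0, -2.0, 0.5])
y1 = X @ beta + 0.1 * rng.normal(size=200)
y2 = X @ beta + 0.1 * rng.normal(size=200)
print(standardized_domain_difference(X, y1, X, y2))
```

Under this kind of metric, domains whose standardized difference falls below a threshold could be grouped as mutually learnable, which matches the structure-recovery goal described in the abstract.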

Cite this Paper


BibTeX
@InProceedings{pmlr-v162-xu22j,
  title     = {A Difference Standardization Method for Mutual Transfer Learning},
  author    = {Xu, Haoqing and Wang, Meng and Wang, Beilun},
  booktitle = {Proceedings of the 39th International Conference on Machine Learning},
  pages     = {24683--24697},
  year      = {2022},
  editor    = {Chaudhuri, Kamalika and Jegelka, Stefanie and Song, Le and Szepesvari, Csaba and Niu, Gang and Sabato, Sivan},
  volume    = {162},
  series    = {Proceedings of Machine Learning Research},
  month     = {17--23 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v162/xu22j/xu22j.pdf},
  url       = {https://proceedings.mlr.press/v162/xu22j.html},
  abstract  = {In many real-world applications, mutual transfer learning is the paradigm that each data domain can potentially be a source or target domain. This is quite different from transfer learning tasks where the source and target are known a priori. However, previous studies about mutual transfer learning either suffer from high computational complexity or oversimplified hypothesis. To overcome these challenges, in this paper, we propose the \underline{Diff}erence \underline{S}tandardization method ({\bf DiffS}) for mutual transfer learning. Specifically, we put forward a novel distance metric between domains, the standardized domain difference, to obtain fast structure recovery and accurate parameter estimation simultaneously. We validate the method's performance using both synthetic and real-world data. Compared to previous methods, DiffS demonstrates a speed-up of approximately 3000 times that of similar methods and achieves the same accurate learnability structure estimation.}
}
Endnote
%0 Conference Paper
%T A Difference Standardization Method for Mutual Transfer Learning
%A Haoqing Xu
%A Meng Wang
%A Beilun Wang
%B Proceedings of the 39th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2022
%E Kamalika Chaudhuri
%E Stefanie Jegelka
%E Le Song
%E Csaba Szepesvari
%E Gang Niu
%E Sivan Sabato
%F pmlr-v162-xu22j
%I PMLR
%P 24683--24697
%U https://proceedings.mlr.press/v162/xu22j.html
%V 162
%X In many real-world applications, mutual transfer learning is the paradigm that each data domain can potentially be a source or target domain. This is quite different from transfer learning tasks where the source and target are known a priori. However, previous studies about mutual transfer learning either suffer from high computational complexity or oversimplified hypothesis. To overcome these challenges, in this paper, we propose the \underline{Diff}erence \underline{S}tandardization method ({\bf DiffS}) for mutual transfer learning. Specifically, we put forward a novel distance metric between domains, the standardized domain difference, to obtain fast structure recovery and accurate parameter estimation simultaneously. We validate the method's performance using both synthetic and real-world data. Compared to previous methods, DiffS demonstrates a speed-up of approximately 3000 times that of similar methods and achieves the same accurate learnability structure estimation.
APA
Xu, H., Wang, M. & Wang, B. (2022). A Difference Standardization Method for Mutual Transfer Learning. Proceedings of the 39th International Conference on Machine Learning, in Proceedings of Machine Learning Research 162:24683-24697. Available from https://proceedings.mlr.press/v162/xu22j.html.