A Class of Geometric Structures in Transfer Learning: Minimax Bounds and Optimality

Xuhui Zhang, Jose Blanchet, Soumyadip Ghosh, Mark S. Squillante
Proceedings of The 25th International Conference on Artificial Intelligence and Statistics, PMLR 151:3794-3820, 2022.

Abstract

We study the problem of transfer learning, observing that previous efforts to understand its information-theoretic limits do not fully exploit the geometric structure of the source and target domains. In contrast, our study first illustrates the benefits of incorporating a natural geometric structure within a linear regression model, which corresponds to the generalized eigenvalue problem formed by the Gram matrices of both domains. We next establish a finite-sample minimax lower bound, propose a refined model interpolation estimator that enjoys a matching upper bound, and then extend our framework to multiple source domains and generalized linear models. Surprisingly, as long as information is available on the distance between the source and target parameters, negative-transfer does not occur. Simulation studies show that our proposed interpolation estimator outperforms state-of-the-art transfer learning methods in both moderate- and high-dimensional settings.

Cite this Paper


BibTeX
@InProceedings{pmlr-v151-zhang22a,
  title = {A Class of Geometric Structures in Transfer Learning: Minimax Bounds and Optimality},
  author = {Zhang, Xuhui and Blanchet, Jose and Ghosh, Soumyadip and Squillante, Mark S.},
  booktitle = {Proceedings of The 25th International Conference on Artificial Intelligence and Statistics},
  pages = {3794--3820},
  year = {2022},
  editor = {Camps-Valls, Gustau and Ruiz, Francisco J. R. and Valera, Isabel},
  volume = {151},
  series = {Proceedings of Machine Learning Research},
  month = {28--30 Mar},
  publisher = {PMLR},
  pdf = {https://proceedings.mlr.press/v151/zhang22a/zhang22a.pdf},
  url = {https://proceedings.mlr.press/v151/zhang22a.html},
  abstract = {We study the problem of transfer learning, observing that previous efforts to understand its information-theoretic limits do not fully exploit the geometric structure of the source and target domains. In contrast, our study first illustrates the benefits of incorporating a natural geometric structure within a linear regression model, which corresponds to the generalized eigenvalue problem formed by the Gram matrices of both domains. We next establish a finite-sample minimax lower bound, propose a refined model interpolation estimator that enjoys a matching upper bound, and then extend our framework to multiple source domains and generalized linear models. Surprisingly, as long as information is available on the distance between the source and target parameters, negative-transfer does not occur. Simulation studies show that our proposed interpolation estimator outperforms state-of-the-art transfer learning methods in both moderate- and high-dimensional settings.}
}
Endnote
%0 Conference Paper
%T A Class of Geometric Structures in Transfer Learning: Minimax Bounds and Optimality
%A Xuhui Zhang
%A Jose Blanchet
%A Soumyadip Ghosh
%A Mark S. Squillante
%B Proceedings of The 25th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2022
%E Gustau Camps-Valls
%E Francisco J. R. Ruiz
%E Isabel Valera
%F pmlr-v151-zhang22a
%I PMLR
%P 3794--3820
%U https://proceedings.mlr.press/v151/zhang22a.html
%V 151
%X We study the problem of transfer learning, observing that previous efforts to understand its information-theoretic limits do not fully exploit the geometric structure of the source and target domains. In contrast, our study first illustrates the benefits of incorporating a natural geometric structure within a linear regression model, which corresponds to the generalized eigenvalue problem formed by the Gram matrices of both domains. We next establish a finite-sample minimax lower bound, propose a refined model interpolation estimator that enjoys a matching upper bound, and then extend our framework to multiple source domains and generalized linear models. Surprisingly, as long as information is available on the distance between the source and target parameters, negative-transfer does not occur. Simulation studies show that our proposed interpolation estimator outperforms state-of-the-art transfer learning methods in both moderate- and high-dimensional settings.
APA
Zhang, X., Blanchet, J., Ghosh, S. & Squillante, M.S. (2022). A Class of Geometric Structures in Transfer Learning: Minimax Bounds and Optimality. Proceedings of The 25th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 151:3794-3820. Available from https://proceedings.mlr.press/v151/zhang22a.html.