XB-MAML: Learning Expandable Basis Parameters for Effective Meta-Learning with Wide Task Coverage

Jae-Jun Lee, Sung Whan Yoon
Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, PMLR 238:3196-3204, 2024.

Abstract

Meta-learning, which pursues an effective initialization model, has emerged as a promising approach to handling unseen tasks. However, a limitation becomes evident when a meta-learner tries to encompass a wide range of task distributions, e.g., learning across distinct datasets or domains. Recently, a group of works has attempted to employ multiple model initializations to cover widely-ranging tasks, but these methods are limited in their ability to adaptively expand the set of initializations. We introduce XB-MAML, which learns expandable basis parameters that are linearly combined to form an effective initialization for a given task. XB-MAML observes the discrepancy between the vector space spanned by the basis and the fine-tuned parameters to decide whether to expand the basis. Our method surpasses existing works on multi-domain meta-learning benchmarks and opens up new opportunities for meta-learning to obtain diverse inductive biases that can be combined to reach effective initializations for diverse unseen tasks.
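
To make the mechanism concrete, here is a minimal, hypothetical NumPy sketch of the two operations the abstract describes: forming a task initialization as a linear combination of basis parameters, and expanding the basis when fine-tuned parameters fall outside its span. The function names (combine_basis, span_residual, maybe_expand), the least-squares projection, and the threshold tau are assumptions made for illustration, not the authors' actual implementation.

    import numpy as np

    # Hypothetical sketch of XB-MAML's basis logic, inferred from the abstract.
    # Each basis element is a flattened parameter vector of dimension D; the
    # basis is stored as an (M, D) array of M such vectors.

    def combine_basis(basis, coeffs):
        """Form a task initialization as a linear combination of the basis."""
        # basis: (M, D) array; coeffs: (M,) combination weights.
        return coeffs @ basis

    def span_residual(basis, theta):
        """Relative distance from parameter vector theta to span(basis)."""
        # Least-squares projection of theta onto the span of the basis vectors.
        coeffs, *_ = np.linalg.lstsq(basis.T, theta, rcond=None)
        projection = basis.T @ coeffs
        return np.linalg.norm(theta - projection) / np.linalg.norm(theta)

    def maybe_expand(basis, theta_finetuned, tau=0.1):
        """Append the fine-tuned parameters as a new basis element when the
        discrepancy from the current span exceeds the threshold tau."""
        if span_residual(basis, theta_finetuned) > tau:
            return np.vstack([basis, theta_finetuned])
        return basis

In this sketch, the residual of projecting the fine-tuned parameters onto the span of the current basis serves as the discrepancy measure; the paper's actual expansion criterion may differ in form.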

Cite this Paper


BibTeX
@InProceedings{pmlr-v238-lee24b,
  title     = {{XB-MAML}: Learning Expandable Basis Parameters for Effective Meta-Learning with Wide Task Coverage},
  author    = {Lee, Jae-Jun and Yoon, Sung Whan},
  booktitle = {Proceedings of The 27th International Conference on Artificial Intelligence and Statistics},
  pages     = {3196--3204},
  year      = {2024},
  editor    = {Dasgupta, Sanjoy and Mandt, Stephan and Li, Yingzhen},
  volume    = {238},
  series    = {Proceedings of Machine Learning Research},
  month     = {02--04 May},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v238/lee24b/lee24b.pdf},
  url       = {https://proceedings.mlr.press/v238/lee24b.html}
}
Endnote
%0 Conference Paper
%T XB-MAML: Learning Expandable Basis Parameters for Effective Meta-Learning with Wide Task Coverage
%A Jae-Jun Lee
%A Sung Whan Yoon
%B Proceedings of The 27th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2024
%E Sanjoy Dasgupta
%E Stephan Mandt
%E Yingzhen Li
%F pmlr-v238-lee24b
%I PMLR
%P 3196--3204
%U https://proceedings.mlr.press/v238/lee24b.html
%V 238
APA
Lee, J.-J., & Yoon, S. W. (2024). XB-MAML: Learning Expandable Basis Parameters for Effective Meta-Learning with Wide Task Coverage. Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 238:3196-3204. Available from https://proceedings.mlr.press/v238/lee24b.html.
