Self-trained Centroid Classifiers for Semi-supervised Cross-domain Few-shot Learning

Hongyu Wang, Eibe Frank, Bernhard Pfahringer, Geoffrey Holmes
Proceedings of The 2nd Conference on Lifelong Learning Agents, PMLR 232:481-492, 2023.

Abstract

State-of-the-art cross-domain few-shot learning methods for image classification apply knowledge transfer by fine-tuning deep feature extractors obtained from source domains on the small labelled dataset available for the target domain, generally in conjunction with a simple centroid-based classification head. Semi-supervised learning during the meta-test phase is an obvious approach to incorporating unlabelled data into cross-domain few-shot learning, but semi-supervised methods designed for larger sets of labelled data than those available in few-shot learning appear to easily go astray when applied in this setting. We propose an efficient semi-supervised learning method that applies self-training to the classification head only and show that it can yield very consistent improvements in average performance in the Meta-Dataset benchmark for cross-domain few-shot learning when applied with contemporary methods utilising centroid-based classification.

Cite this Paper


BibTeX
@InProceedings{pmlr-v232-wang23a,
  title     = {Self-trained Centroid Classifiers for Semi-supervised Cross-domain Few-shot Learning},
  author    = {Wang, Hongyu and Frank, Eibe and Pfahringer, Bernhard and Holmes, Geoffrey},
  booktitle = {Proceedings of The 2nd Conference on Lifelong Learning Agents},
  pages     = {481--492},
  year      = {2023},
  editor    = {Chandar, Sarath and Pascanu, Razvan and Sedghi, Hanie and Precup, Doina},
  volume    = {232},
  series    = {Proceedings of Machine Learning Research},
  month     = {22--25 Aug},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v232/wang23a/wang23a.pdf},
  url       = {https://proceedings.mlr.press/v232/wang23a.html},
  abstract  = {State-of-the-art cross-domain few-shot learning methods for image classification apply knowledge transfer by fine-tuning deep feature extractors obtained from source domains on the small labelled dataset available for the target domain, generally in conjunction with a simple centroid-based classification head. Semi-supervised learning during the meta-test phase is an obvious approach to incorporating unlabelled data into cross-domain few-shot learning, but semi-supervised methods designed for larger sets of labelled data than those available in few-shot learning appear to easily go astray when applied in this setting. We propose an efficient semi-supervised learning method that applies self-training to the classification head only and show that it can yield very consistent improvements in average performance in the Meta-Dataset benchmark for cross-domain few-shot learning when applied with contemporary methods utilising centroid-based classification.}
}
Endnote
%0 Conference Paper
%T Self-trained Centroid Classifiers for Semi-supervised Cross-domain Few-shot Learning
%A Hongyu Wang
%A Eibe Frank
%A Bernhard Pfahringer
%A Geoffrey Holmes
%B Proceedings of The 2nd Conference on Lifelong Learning Agents
%C Proceedings of Machine Learning Research
%D 2023
%E Sarath Chandar
%E Razvan Pascanu
%E Hanie Sedghi
%E Doina Precup
%F pmlr-v232-wang23a
%I PMLR
%P 481--492
%U https://proceedings.mlr.press/v232/wang23a.html
%V 232
%X State-of-the-art cross-domain few-shot learning methods for image classification apply knowledge transfer by fine-tuning deep feature extractors obtained from source domains on the small labelled dataset available for the target domain, generally in conjunction with a simple centroid-based classification head. Semi-supervised learning during the meta-test phase is an obvious approach to incorporating unlabelled data into cross-domain few-shot learning, but semi-supervised methods designed for larger sets of labelled data than those available in few-shot learning appear to easily go astray when applied in this setting. We propose an efficient semi-supervised learning method that applies self-training to the classification head only and show that it can yield very consistent improvements in average performance in the Meta-Dataset benchmark for cross-domain few-shot learning when applied with contemporary methods utilising centroid-based classification.
APA
Wang, H., Frank, E., Pfahringer, B., &amp; Holmes, G. (2023). Self-trained Centroid Classifiers for Semi-supervised Cross-domain Few-shot Learning. Proceedings of The 2nd Conference on Lifelong Learning Agents, in Proceedings of Machine Learning Research 232:481-492. Available from https://proceedings.mlr.press/v232/wang23a.html.