Self-trained Centroid Classifiers for Semi-supervised Cross-domain Few-shot Learning
Proceedings of The 2nd Conference on Lifelong Learning Agents, PMLR 232:481-492, 2023.
Abstract
State-of-the-art cross-domain few-shot learning methods for image classification transfer knowledge by fine-tuning deep feature extractors, obtained from source domains, on the small labelled dataset available for the target domain, generally in conjunction with a simple centroid-based classification head. Semi-supervised learning during the meta-test phase is an obvious way to incorporate unlabelled data into cross-domain few-shot learning, but semi-supervised methods designed for larger labelled sets than those available in few-shot learning appear to go astray easily when applied in this setting. We propose an efficient semi-supervised learning method that applies self-training to the classification head only, and show that it yields consistent improvements in average performance on the Meta-Dataset benchmark for cross-domain few-shot learning when combined with contemporary methods that use centroid-based classification.
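
To make the general idea concrete, the sketch below illustrates self-training restricted to a centroid-based classification head on top of frozen features: class centroids are initialised from the labelled support set, unlabelled examples are pseudo-labelled by nearest centroid, and confidently pseudo-labelled examples are folded back into the centroid estimates. This is a minimal illustration of the technique named in the abstract, not the paper's exact procedure; the function names, the confidence threshold, and the fixed number of iterations are assumptions introduced here for clarity.

```python
import numpy as np

def nearest_centroid_logits(features, centroids):
    """Negative squared Euclidean distances to each centroid, used as logits."""
    # features: (n, d), centroids: (k, d) -> logits: (n, k)
    diffs = features[:, None, :] - centroids[None, :, :]
    return -np.sum(diffs ** 2, axis=2)

def self_trained_centroids(support_feats, support_labels, unlabelled_feats,
                           num_classes, num_iters=5, conf_threshold=0.9):
    """Illustrative self-training loop for a centroid classification head.

    Starts from class centroids of the labelled support set, then repeatedly
    pseudo-labels the unlabelled features and recomputes the centroids from
    the labelled examples plus the confidently pseudo-labelled ones.
    """
    # Initial centroids from the labelled support set only.
    centroids = np.stack([
        support_feats[support_labels == c].mean(axis=0)
        for c in range(num_classes)
    ])

    for _ in range(num_iters):
        logits = nearest_centroid_logits(unlabelled_feats, centroids)
        # Softmax over negative distances gives a confidence per pseudo-label.
        probs = np.exp(logits - logits.max(axis=1, keepdims=True))
        probs /= probs.sum(axis=1, keepdims=True)
        pseudo_labels = probs.argmax(axis=1)
        confident = probs.max(axis=1) >= conf_threshold

        # Recompute centroids from labelled data plus confident pseudo-labels.
        new_centroids = []
        for c in range(num_classes):
            members = [support_feats[support_labels == c]]
            extra = unlabelled_feats[confident & (pseudo_labels == c)]
            if len(extra) > 0:
                members.append(extra)
            new_centroids.append(np.concatenate(members).mean(axis=0))
        centroids = np.stack(new_centroids)

    return centroids
```

Because only the centroids are updated, each self-training round is just a pseudo-labelling pass followed by mean computations over fixed feature vectors, which keeps the procedure cheap relative to fine-tuning the feature extractor itself.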