Semi-Supervised Few-Shot Learning with Prototypical Random Walks

Ahmed Ayyad, Yuchen Li, Raden Muaz, Shadi Albarqouni, Mohamed Elhoseiny
AAAI Workshop on Meta-Learning and MetaDL Challenge, PMLR 140:45-57, 2021.

Abstract

Recent progress has shown that few-shot learning can be improved with access to unlabeled data, known as semi-supervised few-shot learning (SS-FSL). We introduce an SS-FSL approach, dubbed Prototypical Random Walk Networks (PRWN), built on top of Prototypical Networks (PN). We develop a random walk semi-supervised loss that enables the network to learn representations that are compact and well-separated. Our work is related to the very recent development of graph-based approaches for few-shot learning. However, we show that compact and well-separated class representations can be achieved by modeling our prototypical random walk notion without additional graph-NN parameters and without a transductive setting where a collective test set is provided. Our model outperforms baselines on most benchmarks, with significant improvements in some cases. Trained with 40% of the data as labeled, our model compares competitively against fully supervised prototypical networks trained on 100% of the labels, even outperforming them in the 1-shot mini-Imagenet case with 50.89% versus 49.4% accuracy. We also show that our loss is resistant to distractors, unlabeled data that does not belong to any of the training classes, reflecting robustness to labeled/unlabeled class distribution mismatch.
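
The paper's own training code is not reproduced on this page, but the core idea of the loss can be sketched. Below is a minimal, single-step illustration of a prototypical random walk term in PyTorch: class prototypes act as walker start states, one prototype-to-unlabeled-to-prototype round trip is scored, and the walker is rewarded for returning to its starting class. The similarity (negative squared Euclidean distance), the single-step walk, the auxiliary "visit" regularizer, and names such as prototypical_random_walk_loss and visit_weight are illustrative assumptions for this sketch, not the authors' implementation.

    import torch
    import torch.nn.functional as F

    def prototypical_random_walk_loss(prototypes, unlabeled, visit_weight=0.1):
        # prototypes: [N, D] class prototypes averaged from labeled support embeddings
        # unlabeled:  [M, D] embeddings of the episode's unlabeled examples

        # Similarity = negative squared Euclidean distance, as in Prototypical Networks.
        sim = -torch.cdist(prototypes, unlabeled) ** 2          # [N, M]

        # Transition probabilities prototype -> unlabeled and unlabeled -> prototype.
        p_pu = F.softmax(sim, dim=1)                             # [N, M]
        p_up = F.softmax(sim.t(), dim=1)                         # [M, N]

        # One round trip of the walker: prototype -> unlabeled -> prototype.
        # Target is the identity: the walker should land back on its starting class.
        p_round = p_pu @ p_up                                    # [N, N]
        target = torch.arange(prototypes.size(0), device=prototypes.device)
        walk_loss = F.nll_loss(torch.log(p_round + 1e-8), target)

        # "Visit" term (assumed form): encourage the walker to use all unlabeled
        # points roughly uniformly, so every unlabeled example is pulled toward
        # some prototype rather than being ignored.
        visit = p_pu.mean(dim=0)                                 # [M]
        visit_loss = -torch.log(visit + 1e-8).mean()

        return walk_loss + visit_weight * visit_loss

During episodic training, a term like this would be added to the standard prototypical-network cross-entropy on the labeled query set, e.g. loss = pn_loss + lam * prototypical_random_walk_loss(protos, unlabeled_emb), with lam a weighting hyperparameter.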

Cite this Paper


BibTeX
@InProceedings{pmlr-v140-ayyad21a,
  title     = {Semi-Supervised Few-Shot Learning with Prototypical Random Walks},
  author    = {Ayyad, Ahmed and Li, Yuchen and Muaz, Raden and Albarqouni, Shadi and Elhoseiny, Mohamed},
  booktitle = {AAAI Workshop on Meta-Learning and MetaDL Challenge},
  pages     = {45--57},
  year      = {2021},
  editor    = {Guyon, Isabelle and van Rijn, Jan N. and Treguer, Sébastien and Vanschoren, Joaquin},
  volume    = {140},
  series    = {Proceedings of Machine Learning Research},
  month     = {09 Feb},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v140/ayyad21a/ayyad21a.pdf},
  url       = {https://proceedings.mlr.press/v140/ayyad21a.html},
  abstract  = {Recent progress has shown that few-shot learning can be improved with access to unlabelled data, known as semi-supervised few-shot learning(SS-FSL). We introduce an SS-FSL approach, dubbed as Prototypical Random Walk Networks(PRWN), built on top of Prototypical Networks (PN). We develop a random walk semi-supervised loss that enables the network to learn representations that are compact and well-separated. Our work is related to the very recent development of graph-based approaches for few-shot learning. However, we show that compact and well-separated class representations can be achieved by modeling our prototypical random walk notion without needing additional graph-NN parameters or requiring a transductive setting where a collective test set is provided. Our model outperforms baselines in most benchmarks with significant improvements in some cases. Our model, trained with 40\% of the data as labeled, compares competitively against fully supervised prototypical networks, trained on 100\% of the labels, even outperforming it in the 1-shot mini-Imagenet case with 50.89\% to 49.4\% accuracy. We also show that our loss is resistant to distractors, unlabeled data that does not belong to any of the training classes, and hence reflecting robustness to labeled/unlabeled class distribution mismatch.}
}
Endnote
%0 Conference Paper
%T Semi-Supervised Few-Shot Learning with Prototypical Random Walks
%A Ahmed Ayyad
%A Yuchen Li
%A Raden Muaz
%A Shadi Albarqouni
%A Mohamed Elhoseiny
%B AAAI Workshop on Meta-Learning and MetaDL Challenge
%C Proceedings of Machine Learning Research
%D 2021
%E Isabelle Guyon
%E Jan N. van Rijn
%E Sébastien Treguer
%E Joaquin Vanschoren
%F pmlr-v140-ayyad21a
%I PMLR
%P 45--57
%U https://proceedings.mlr.press/v140/ayyad21a.html
%V 140
%X Recent progress has shown that few-shot learning can be improved with access to unlabelled data, known as semi-supervised few-shot learning(SS-FSL). We introduce an SS-FSL approach, dubbed as Prototypical Random Walk Networks(PRWN), built on top of Prototypical Networks (PN). We develop a random walk semi-supervised loss that enables the network to learn representations that are compact and well-separated. Our work is related to the very recent development of graph-based approaches for few-shot learning. However, we show that compact and well-separated class representations can be achieved by modeling our prototypical random walk notion without needing additional graph-NN parameters or requiring a transductive setting where a collective test set is provided. Our model outperforms baselines in most benchmarks with significant improvements in some cases. Our model, trained with 40% of the data as labeled, compares competitively against fully supervised prototypical networks, trained on 100% of the labels, even outperforming it in the 1-shot mini-Imagenet case with 50.89% to 49.4% accuracy. We also show that our loss is resistant to distractors, unlabeled data that does not belong to any of the training classes, and hence reflecting robustness to labeled/unlabeled class distribution mismatch.
APA
Ayyad, A., Li, Y., Muaz, R., Albarqouni, S. & Elhoseiny, M. (2021). Semi-Supervised Few-Shot Learning with Prototypical Random Walks. AAAI Workshop on Meta-Learning and MetaDL Challenge, in Proceedings of Machine Learning Research 140:45-57. Available from https://proceedings.mlr.press/v140/ayyad21a.html.
