Lessons learned from the NeurIPS 2021 MetaDL challenge: Backbone fine-tuning without episodic meta-learning dominates for few-shot learning image classification

Adrian El Baz, Ihsan Ullah, Edesio Alcobaça, André C. P. L. F. Carvalho, Hong Chen, Fabio Ferreira, Henry Gouk, Chaoyu Guan, Isabelle Guyon, Timothy Hospedales, Shell Hu, Mike Huisman, Frank Hutter, Zhengying Liu, Felix Mohr, Ekrem Öztürk, Jan N. van Rijn, Haozhe Sun, Xin Wang, Wenwu Zhu
Proceedings of the NeurIPS 2021 Competitions and Demonstrations Track, PMLR 176:80-96, 2022.

Abstract

Although deep neural networks are capable of achieving performance superior to humans on various tasks, they are notorious for requiring large amounts of data and computing resources, restricting their success to domains where such resources are available. Meta-learning methods can address this problem by transferring knowledge from related tasks, thus reducing the amount of data and computing resources needed to learn new tasks. We organize the MetaDL competition series, which provides opportunities for research groups all over the world to create and experimentally assess new meta-(deep)learning solutions for real problems. In this paper, authored collaboratively by the competition organizers and the top-ranked participants, we describe the design of the competition, the datasets, the best experimental results, and the top-ranked methods in the NeurIPS 2021 challenge, which attracted 15 active teams that reached the final phase (by outperforming the baseline) and made over 100 code submissions during the feedback phase. The solutions of the top participants have been open-sourced. The lessons learned include that learning good representations is essential for effective transfer learning.

Cite this Paper

BibTeX
@InProceedings{pmlr-v176-el-baz22a,
  title     = {Lessons learned from the NeurIPS 2021 MetaDL challenge: Backbone fine-tuning without episodic meta-learning dominates for few-shot learning image classification},
  author    = {El Baz, Adrian and Ullah, Ihsan and Alcoba\c{c}a, Edesio and Carvalho, Andr\'{e} C. P. L. F. and Chen, Hong and Ferreira, Fabio and Gouk, Henry and Guan, Chaoyu and Guyon, Isabelle and Hospedales, Timothy and Hu, Shell and Huisman, Mike and Hutter, Frank and Liu, Zhengying and Mohr, Felix and \"Ozt\"urk, Ekrem and van Rijn, Jan N. and Sun, Haozhe and Wang, Xin and Zhu, Wenwu},
  booktitle = {Proceedings of the NeurIPS 2021 Competitions and Demonstrations Track},
  pages     = {80--96},
  year      = {2022},
  editor    = {Kiela, Douwe and Ciccone, Marco and Caputo, Barbara},
  volume    = {176},
  series    = {Proceedings of Machine Learning Research},
  month     = {06--14 Dec},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v176/el-baz22a/el-baz22a.pdf},
  url       = {https://proceedings.mlr.press/v176/el-baz22a.html},
  abstract  = {Although deep neural networks are capable of achieving performance superior to humans on various tasks, they are notorious for requiring large amounts of data and computing resources, restricting their success to domains where such resources are available. Meta-learning methods can address this problem by transferring knowledge from related tasks, thus reducing the amount of data and computing resources needed to learn new tasks. We organize the MetaDL competition series, which provides opportunities for research groups all over the world to create and experimentally assess new meta-(deep)learning solutions for real problems. In this paper, authored collaboratively by the competition organizers and the top-ranked participants, we describe the design of the competition, the datasets, the best experimental results, and the top-ranked methods in the NeurIPS 2021 challenge, which attracted 15 active teams that reached the final phase (by outperforming the baseline) and made over 100 code submissions during the feedback phase. The solutions of the top participants have been open-sourced. The lessons learned include that learning good representations is essential for effective transfer learning.}
}
Endnote
%0 Conference Paper
%T Lessons learned from the NeurIPS 2021 MetaDL challenge: Backbone fine-tuning without episodic meta-learning dominates for few-shot learning image classification
%A Adrian El Baz
%A Ihsan Ullah
%A Edesio Alcobaça
%A André C. P. L. F. Carvalho
%A Hong Chen
%A Fabio Ferreira
%A Henry Gouk
%A Chaoyu Guan
%A Isabelle Guyon
%A Timothy Hospedales
%A Shell Hu
%A Mike Huisman
%A Frank Hutter
%A Zhengying Liu
%A Felix Mohr
%A Ekrem Öztürk
%A Jan N. van Rijn
%A Haozhe Sun
%A Xin Wang
%A Wenwu Zhu
%B Proceedings of the NeurIPS 2021 Competitions and Demonstrations Track
%C Proceedings of Machine Learning Research
%D 2022
%E Douwe Kiela
%E Marco Ciccone
%E Barbara Caputo
%F pmlr-v176-el-baz22a
%I PMLR
%P 80--96
%U https://proceedings.mlr.press/v176/el-baz22a.html
%V 176
%X Although deep neural networks are capable of achieving performance superior to humans on various tasks, they are notorious for requiring large amounts of data and computing resources, restricting their success to domains where such resources are available. Meta-learning methods can address this problem by transferring knowledge from related tasks, thus reducing the amount of data and computing resources needed to learn new tasks. We organize the MetaDL competition series, which provides opportunities for research groups all over the world to create and experimentally assess new meta-(deep)learning solutions for real problems. In this paper, authored collaboratively by the competition organizers and the top-ranked participants, we describe the design of the competition, the datasets, the best experimental results, and the top-ranked methods in the NeurIPS 2021 challenge, which attracted 15 active teams that reached the final phase (by outperforming the baseline) and made over 100 code submissions during the feedback phase. The solutions of the top participants have been open-sourced. The lessons learned include that learning good representations is essential for effective transfer learning.
APA
El Baz, A., Ullah, I., Alcobaça, E., Carvalho, A.C.P.L.F., Chen, H., Ferreira, F., Gouk, H., Guan, C., Guyon, I., Hospedales, T., Hu, S., Huisman, M., Hutter, F., Liu, Z., Mohr, F., Öztürk, E., van Rijn, J.N., Sun, H., Wang, X. & Zhu, W. (2022). Lessons learned from the NeurIPS 2021 MetaDL challenge: Backbone fine-tuning without episodic meta-learning dominates for few-shot learning image classification. Proceedings of the NeurIPS 2021 Competitions and Demonstrations Track, in Proceedings of Machine Learning Research 176:80-96. Available from https://proceedings.mlr.press/v176/el-baz22a.html.
