Tilting the Odds at the Lottery: the Interplay of Overparameterisation and Curricula in Neural Networks

Stefano Sarao Mannelli, Yaraslau Ivashynka, Andrew M Saxe, Luca Saglietti
Proceedings of the 41st International Conference on Machine Learning, PMLR 235:34586-34602, 2024.

Abstract

A wide range of empirical and theoretical works have shown that overparameterisation can amplify the performance of neural networks. According to the lottery ticket hypothesis, overparameterised networks have an increased chance of containing a sub-network that is well-initialised to solve the task at hand. A more parsimonious approach, inspired by animal learning, consists in guiding the learner towards solving the task by curating the order of the examples, i.e., providing a curriculum. However, this learning strategy appears to be of little benefit in deep learning applications. In this work, we propose a theoretical analysis that connects curriculum learning and overparameterisation. In particular, we investigate their interplay in the online learning setting for a 2-layer network on the XOR-like Gaussian Mixture problem. Our results show that a high degree of overparameterisation, while simplifying the problem, can limit the benefit of curricula, providing a theoretical account of the ineffectiveness of curricula in deep learning.
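
To fix ideas, here is a minimal, hypothetical Python sketch (not the authors' code) of the setup described in the abstract: data drawn from an XOR-like Gaussian mixture, a 2-layer ReLU network with a frozen random readout trained by online SGD, and a simple easy-to-hard noise-annealing schedule standing in for a curriculum. The cluster geometry, the schedule, and all hyperparameters are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

# Illustrative hyperparameters (assumptions, not taken from the paper).
d, hidden, steps, lr = 50, 200, 20000, 0.5
mu = 1.0  # magnitude of the cluster means

def sample(noise_std):
    """Draw one example from an XOR-like Gaussian mixture:
    clusters at +/- mu*e1 carry label +1, clusters at +/- mu*e2 label -1."""
    axis = rng.integers(2)
    sign = rng.choice([-1.0, 1.0])
    x = noise_std * rng.standard_normal(d)
    x[axis] += sign * mu
    return x, (1.0 if axis == 0 else -1.0)

# 2-layer network: trainable first layer, frozen random +/-1 readout.
W = rng.standard_normal((hidden, d)) / np.sqrt(d)
a = rng.choice([-1.0, 1.0], size=hidden) / hidden

def forward(x):
    h = np.maximum(W @ x, 0.0)  # ReLU hidden activations
    return a @ h, h

# Online SGD on the squared loss. The "curriculum" here is an
# easy-to-hard schedule that anneals the cluster noise upward.
for t in range(steps):
    noise_std = 0.2 + 0.8 * t / steps
    x, y = sample(noise_std)
    out, h = forward(x)
    err = out - y
    W -= lr * err * (a * (h > 0.0))[:, None] * x[None, :]

# Sanity check on the hardest (end-of-curriculum) distribution.
tests = [sample(1.0) for _ in range(2000)]
acc = np.mean([np.sign(forward(x)[0]) == y for x, y in tests])
print(f"accuracy on hard examples: {acc:.2f}")

Varying `hidden` in this sketch is one way to probe the paper's question: whether the gap between the curriculum schedule above and i.i.d. sampling at fixed noise shrinks as the network becomes more overparameterised.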

Cite this Paper


BibTeX
@InProceedings{pmlr-v235-mannelli24a,
  title     = {Tilting the Odds at the Lottery: the Interplay of Overparameterisation and Curricula in Neural Networks},
  author    = {Mannelli, Stefano Sarao and Ivashynka, Yaraslau and Saxe, Andrew M and Saglietti, Luca},
  booktitle = {Proceedings of the 41st International Conference on Machine Learning},
  pages     = {34586--34602},
  year      = {2024},
  editor    = {Salakhutdinov, Ruslan and Kolter, Zico and Heller, Katherine and Weller, Adrian and Oliver, Nuria and Scarlett, Jonathan and Berkenkamp, Felix},
  volume    = {235},
  series    = {Proceedings of Machine Learning Research},
  month     = {21--27 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v235/main/assets/mannelli24a/mannelli24a.pdf},
  url       = {https://proceedings.mlr.press/v235/mannelli24a.html},
  abstract  = {A wide range of empirical and theoretical works have shown that overparameterisation can amplify the performance of neural networks. According to the lottery ticket hypothesis, overparameterised networks have an increased chance of containing a sub-network that is well-initialised to solve the task at hand. A more parsimonious approach, inspired by animal learning, consists in guiding the learner towards solving the task by curating the order of the examples, i.e., providing a curriculum. However, this learning strategy appears to be of little benefit in deep learning applications. In this work, we propose a theoretical analysis that connects curriculum learning and overparameterisation. In particular, we investigate their interplay in the online learning setting for a 2-layer network on the XOR-like Gaussian Mixture problem. Our results show that a high degree of overparameterisation, while simplifying the problem, can limit the benefit of curricula, providing a theoretical account of the ineffectiveness of curricula in deep learning.}
}
Endnote
%0 Conference Paper
%T Tilting the Odds at the Lottery: the Interplay of Overparameterisation and Curricula in Neural Networks
%A Stefano Sarao Mannelli
%A Yaraslau Ivashynka
%A Andrew M Saxe
%A Luca Saglietti
%B Proceedings of the 41st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2024
%E Ruslan Salakhutdinov
%E Zico Kolter
%E Katherine Heller
%E Adrian Weller
%E Nuria Oliver
%E Jonathan Scarlett
%E Felix Berkenkamp
%F pmlr-v235-mannelli24a
%I PMLR
%P 34586--34602
%U https://proceedings.mlr.press/v235/mannelli24a.html
%V 235
%X A wide range of empirical and theoretical works have shown that overparameterisation can amplify the performance of neural networks. According to the lottery ticket hypothesis, overparameterised networks have an increased chance of containing a sub-network that is well-initialised to solve the task at hand. A more parsimonious approach, inspired by animal learning, consists in guiding the learner towards solving the task by curating the order of the examples, i.e., providing a curriculum. However, this learning strategy appears to be of little benefit in deep learning applications. In this work, we propose a theoretical analysis that connects curriculum learning and overparameterisation. In particular, we investigate their interplay in the online learning setting for a 2-layer network on the XOR-like Gaussian Mixture problem. Our results show that a high degree of overparameterisation, while simplifying the problem, can limit the benefit of curricula, providing a theoretical account of the ineffectiveness of curricula in deep learning.
APA
Mannelli, S.S., Ivashynka, Y., Saxe, A.M. & Saglietti, L. (2024). Tilting the Odds at the Lottery: the Interplay of Overparameterisation and Curricula in Neural Networks. Proceedings of the 41st International Conference on Machine Learning, in Proceedings of Machine Learning Research 235:34586-34602. Available from https://proceedings.mlr.press/v235/mannelli24a.html.
