Proving the Lottery Ticket Hypothesis: Pruning is All You Need

Eran Malach, Gilad Yehudai, Shai Shalev-Schwartz, Ohad Shamir
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:6682-6691, 2020.

Abstract

The lottery ticket hypothesis (Frankle and Carbin, 2018) states that a randomly-initialized network contains a small subnetwork that, when trained in isolation, can compete with the performance of the original network. We prove an even stronger hypothesis (as was also conjectured in Ramanujan et al., 2019), showing that for every bounded distribution and every target network with bounded weights, a sufficiently over-parameterized neural network with random weights contains a subnetwork with roughly the same accuracy as the target network, without any further training.
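To illustrate why over-parameterization helps, here is a minimal sketch (an assumption for illustration, not the paper's formal construction): approximate a single target weight by pruning a wide random layer down to the one product of random weights that happens to land near the target. The function name `prune_to_match` and all parameters are hypothetical.

```python
import numpy as np

def prune_to_match(w, width, rng):
    """Illustrative pruning: among `width` random weight pairs (u_i, v_i),
    keep only the pair whose product u_i * v_i best matches the target w.
    With more random candidates (over-parameterization), some product is
    close to w with high probability -- no training involved."""
    u = rng.uniform(-1, 1, size=width)  # random first-layer weights
    v = rng.uniform(-1, 1, size=width)  # random second-layer weights
    products = u * v
    best = np.argmin(np.abs(products - w))  # "prune" all pairs but one
    return products[best]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    w_target = 0.37  # arbitrary target weight
    for width in (10, 100, 10_000):
        approx = prune_to_match(w_target, width, rng)
        print(f"width={width:>6}  |error|={abs(approx - w_target):.5f}")
```

Running this, the approximation error shrinks as the width grows, mirroring the abstract's claim that a sufficiently over-parameterized random network contains a good subnetwork by selection alone.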

Cite this Paper


BibTeX
@InProceedings{pmlr-v119-malach20a,
  title     = {Proving the Lottery Ticket Hypothesis: Pruning is All You Need},
  author    = {Malach, Eran and Yehudai, Gilad and Shalev-Schwartz, Shai and Shamir, Ohad},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {6682--6691},
  year      = {2020},
  editor    = {III, Hal Daumé and Singh, Aarti},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/malach20a/malach20a.pdf},
  url       = {http://proceedings.mlr.press/v119/malach20a.html},
  abstract  = {The lottery ticket hypothesis (Frankle and Carbin, 2018), states that a randomly-initialized network contains a small subnetwork such that, when trained in isolation, can compete with the performance of the original network. We prove an even stronger hypothesis (as was also conjectured in Ramanujan et al., 2019), showing that for every bounded distribution and every target network with bounded weights, a sufficiently over-parameterized neural network with random weights contains a subnetwork with roughly the same accuracy as the target network, without any further training.}
}
Endnote
%0 Conference Paper
%T Proving the Lottery Ticket Hypothesis: Pruning is All You Need
%A Eran Malach
%A Gilad Yehudai
%A Shai Shalev-Schwartz
%A Ohad Shamir
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-malach20a
%I PMLR
%P 6682--6691
%U http://proceedings.mlr.press/v119/malach20a.html
%V 119
%X The lottery ticket hypothesis (Frankle and Carbin, 2018), states that a randomly-initialized network contains a small subnetwork such that, when trained in isolation, can compete with the performance of the original network. We prove an even stronger hypothesis (as was also conjectured in Ramanujan et al., 2019), showing that for every bounded distribution and every target network with bounded weights, a sufficiently over-parameterized neural network with random weights contains a subnetwork with roughly the same accuracy as the target network, without any further training.
APA
Malach, E., Yehudai, G., Shalev-Schwartz, S. & Shamir, O. (2020). Proving the Lottery Ticket Hypothesis: Pruning is All You Need. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:6682-6691. Available from http://proceedings.mlr.press/v119/malach20a.html.