Weisfeiler and Leman Go Gambling: Why Expressive Lottery Tickets Win

Lorenz Kummer, Samir Moustafa, Anatol Ehrlich, Franka Bause, Nikolaus Suess, Wilfried N. Gansterer, Nils Morten Kriege
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:31991-32010, 2025.

Abstract

The lottery ticket hypothesis (LTH) is well-studied for convolutional neural networks but has been validated only empirically for graph neural networks (GNNs), for which theoretical findings are largely lacking. In this paper, we identify the expressivity of sparse subnetworks, i.e. their ability to distinguish non-isomorphic graphs, as crucial for finding winning tickets that preserve the predictive performance. We establish conditions under which the expressivity of a sparsely initialized GNN matches that of the full network, particularly when compared to the Weisfeiler-Leman test, and in that context put forward and prove a Strong Expressive Lottery Ticket Hypothesis. We subsequently show that an increased expressivity in the initialization potentially accelerates model convergence and improves generalization. Our findings establish novel theoretical foundations for both LTH and GNN research, highlighting the importance of maintaining expressivity in sparsely initialized GNNs. We illustrate our results using examples from drug discovery.
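The abstract measures expressivity against the Weisfeiler-Leman test. As background, here is a minimal sketch of 1-WL color refinement (standard algorithm, not code from the paper; graph encodings and round count are illustrative choices): two graphs are certified non-isomorphic when their refined color histograms differ.

```python
from collections import Counter

def wl_colors(adj, rounds=3):
    """Run 1-WL color refinement on a graph given as an adjacency
    list {node: [neighbors]}. Returns the histogram of final colors."""
    colors = {v: 0 for v in adj}  # uniform initial coloring
    for _ in range(rounds):
        # A node's new color encodes its old color together with the
        # multiset of its neighbors' old colors.
        signatures = {
            v: (colors[v], tuple(sorted(colors[u] for u in adj[v])))
            for v in adj
        }
        # Injectively relabel signatures with small integers.
        palette = {sig: i for i, sig in enumerate(sorted(set(signatures.values())))}
        colors = {v: palette[signatures[v]] for v in adj}
    return Counter(colors.values())

def wl_distinguishes(adj1, adj2, rounds=3):
    """True iff 1-WL certifies the two graphs are non-isomorphic."""
    return wl_colors(adj1, rounds) != wl_colors(adj2, rounds)

# A 6-cycle vs. a 6-node path: degree sequences differ, so 1-WL separates them.
c6 = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
p6 = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3, 5], 5: [4]}

# A 6-cycle vs. two disjoint triangles: both 2-regular, the classic
# failure case where 1-WL (and hence standard message-passing GNNs) fails.
two_triangles = {0: [1, 2], 1: [0, 2], 2: [0, 1],
                 3: [4, 5], 4: [3, 5], 5: [3, 4]}
```

The second pair illustrates why expressivity is the right yardstick: a GNN bounded by 1-WL cannot tell these graphs apart no matter how it is trained, so a sparse subnetwork that loses expressivity at initialization cannot recover it.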

Cite this Paper

BibTeX
@InProceedings{pmlr-v267-kummer25a,
  title     = {Weisfeiler and Leman Go Gambling: Why Expressive Lottery Tickets Win},
  author    = {Kummer, Lorenz and Moustafa, Samir and Ehrlich, Anatol and Bause, Franka and Suess, Nikolaus and Gansterer, Wilfried N. and Kriege, Nils Morten},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages     = {31991--32010},
  year      = {2025},
  editor    = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume    = {267},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/kummer25a/kummer25a.pdf},
  url       = {https://proceedings.mlr.press/v267/kummer25a.html},
  abstract  = {The lottery ticket hypothesis (LTH) is well-studied for convolutional neural networks but has been validated only empirically for graph neural networks (GNNs), for which theoretical findings are largely lacking. In this paper, we identify the expressivity of sparse subnetworks, i.e. their ability to distinguish non-isomorphic graphs, as crucial for finding winning tickets that preserve the predictive performance. We establish conditions under which the expressivity of a sparsely initialized GNN matches that of the full network, particularly when compared to the Weisfeiler-Leman test, and in that context put forward and prove a Strong Expressive Lottery Ticket Hypothesis. We subsequently show that an increased expressivity in the initialization potentially accelerates model convergence and improves generalization. Our findings establish novel theoretical foundations for both LTH and GNN research, highlighting the importance of maintaining expressivity in sparsely initialized GNNs. We illustrate our results using examples from drug discovery.}
}
Endnote
%0 Conference Paper
%T Weisfeiler and Leman Go Gambling: Why Expressive Lottery Tickets Win
%A Lorenz Kummer
%A Samir Moustafa
%A Anatol Ehrlich
%A Franka Bause
%A Nikolaus Suess
%A Wilfried N. Gansterer
%A Nils Morten Kriege
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-kummer25a
%I PMLR
%P 31991--32010
%U https://proceedings.mlr.press/v267/kummer25a.html
%V 267
%X The lottery ticket hypothesis (LTH) is well-studied for convolutional neural networks but has been validated only empirically for graph neural networks (GNNs), for which theoretical findings are largely lacking. In this paper, we identify the expressivity of sparse subnetworks, i.e. their ability to distinguish non-isomorphic graphs, as crucial for finding winning tickets that preserve the predictive performance. We establish conditions under which the expressivity of a sparsely initialized GNN matches that of the full network, particularly when compared to the Weisfeiler-Leman test, and in that context put forward and prove a Strong Expressive Lottery Ticket Hypothesis. We subsequently show that an increased expressivity in the initialization potentially accelerates model convergence and improves generalization. Our findings establish novel theoretical foundations for both LTH and GNN research, highlighting the importance of maintaining expressivity in sparsely initialized GNNs. We illustrate our results using examples from drug discovery.
APA
Kummer, L., Moustafa, S., Ehrlich, A., Bause, F., Suess, N., Gansterer, W.N. & Kriege, N.M. (2025). Weisfeiler and Leman Go Gambling: Why Expressive Lottery Tickets Win. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:31991-32010. Available from https://proceedings.mlr.press/v267/kummer25a.html.
