TopoTune: A Framework for Generalized Combinatorial Complex Neural Networks

Mathilde Papillon, Guillermo Bernardez, Claudio Battiloro, Nina Miolane
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:47924-47952, 2025.

Abstract

Graph Neural Networks (GNNs) effectively learn from relational data by leveraging graph symmetries. However, many real-world systems—such as biological or social networks—feature multi-way interactions that GNNs fail to capture. Topological Deep Learning (TDL) addresses this by modeling and leveraging higher-order structures, with Combinatorial Complex Neural Networks (CCNNs) offering a general and expressive approach that has been shown to outperform GNNs. However, TDL lacks the principled and standardized frameworks that underpin GNN development, restricting its accessibility and applicability. To address this issue, we introduce Generalized CCNNs (GCCNs), a simple yet powerful family of TDL models that can be used to systematically transform any (graph) neural network into its TDL counterpart. We prove that GCCNs generalize and subsume CCNNs, while extensive experiments on a diverse class of GCCNs show that these architectures consistently match or outperform CCNNs, often with less model complexity. In an effort to accelerate and democratize TDL, we introduce TopoTune, a lightweight software for defining, building, and training GCCNs with unprecedented flexibility and ease.

Cite this Paper


BibTeX
@InProceedings{pmlr-v267-papillon25a,
  title     = {{T}opo{T}une: A Framework for Generalized Combinatorial Complex Neural Networks},
  author    = {Papillon, Mathilde and Bernardez, Guillermo and Battiloro, Claudio and Miolane, Nina},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages     = {47924--47952},
  year      = {2025},
  editor    = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume    = {267},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/papillon25a/papillon25a.pdf},
  url       = {https://proceedings.mlr.press/v267/papillon25a.html},
  abstract  = {Graph Neural Networks (GNNs) effectively learn from relational data by leveraging graph symmetries. However, many real-world systems—such as biological or social networks—feature multi-way interactions that GNNs fail to capture. Topological Deep Learning (TDL) addresses this by modeling and leveraging higher-order structures, with Combinatorial Complex Neural Networks (CCNNs) offering a general and expressive approach that has been shown to outperform GNNs. However, TDL lacks the principled and standardized frameworks that underpin GNN development, restricting its accessibility and applicability. To address this issue, we introduce Generalized CCNNs (GCCNs), a simple yet powerful family of TDL models that can be used to systematically transform any (graph) neural network into its TDL counterpart. We prove that GCCNs generalize and subsume CCNNs, while extensive experiments on a diverse class of GCCNs show that these architectures consistently match or outperform CCNNs, often with less model complexity. In an effort to accelerate and democratize TDL, we introduce TopoTune, a lightweight software for defining, building, and training GCCNs with unprecedented flexibility and ease.}
}
Endnote
%0 Conference Paper
%T TopoTune: A Framework for Generalized Combinatorial Complex Neural Networks
%A Mathilde Papillon
%A Guillermo Bernardez
%A Claudio Battiloro
%A Nina Miolane
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-papillon25a
%I PMLR
%P 47924--47952
%U https://proceedings.mlr.press/v267/papillon25a.html
%V 267
%X Graph Neural Networks (GNNs) effectively learn from relational data by leveraging graph symmetries. However, many real-world systems—such as biological or social networks—feature multi-way interactions that GNNs fail to capture. Topological Deep Learning (TDL) addresses this by modeling and leveraging higher-order structures, with Combinatorial Complex Neural Networks (CCNNs) offering a general and expressive approach that has been shown to outperform GNNs. However, TDL lacks the principled and standardized frameworks that underpin GNN development, restricting its accessibility and applicability. To address this issue, we introduce Generalized CCNNs (GCCNs), a simple yet powerful family of TDL models that can be used to systematically transform any (graph) neural network into its TDL counterpart. We prove that GCCNs generalize and subsume CCNNs, while extensive experiments on a diverse class of GCCNs show that these architectures consistently match or outperform CCNNs, often with less model complexity. In an effort to accelerate and democratize TDL, we introduce TopoTune, a lightweight software for defining, building, and training GCCNs with unprecedented flexibility and ease.
APA
Papillon, M., Bernardez, G., Battiloro, C., & Miolane, N. (2025). TopoTune: A Framework for Generalized Combinatorial Complex Neural Networks. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:47924-47952. Available from https://proceedings.mlr.press/v267/papillon25a.html.