Unsupervised Parameter-free Simplicial Representation Learning with Scattering Transforms

Hiren Madhu, Sravanthi Gurugubelli, Sundeep Prabhakar Chepuri
Proceedings of the 41st International Conference on Machine Learning, PMLR 235:34145-34160, 2024.

Abstract

Simplicial neural network models are becoming popular for processing and analyzing higher-order graph data, but they suffer from high training complexity and dependence on task-specific labels. To address these challenges, we propose simplicial scattering networks (SSNs), a parameter-free model inspired by scattering transforms designed to extract task-agnostic features from simplicial complex data without labels in a principled manner. Specifically, we propose a simplicial scattering transform based on random walk matrices for various adjacencies underlying a simplicial complex. We then use the simplicial scattering transform to construct a deep filter bank network that captures high-frequency information at multiple scales. The proposed simplicial scattering transform possesses properties such as permutation invariance, robustness to perturbations, and expressivity. We theoretically prove that including higher-order information improves the robustness of SSNs to perturbations. Empirical evaluations demonstrate that SSNs outperform existing simplicial or graph neural models in many tasks like node classification, simplicial closure, graph classification, trajectory prediction, and simplex prediction while being computationally efficient.
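The abstract describes a scattering-style pipeline: build random walk matrices from the adjacencies of a simplicial complex, form a multiscale wavelet filter bank from them, and cascade filtering with a modulus nonlinearity to get parameter-free features. As a rough, hedged illustration (not the paper's exact construction), the sketch below applies the standard geometric-scattering recipe — dyadic wavelets $\Psi_j = P^{2^{j-1}} - P^{2^j}$ built from a single lazy random walk matrix $P$ — to a signal on one adjacency; the paper instead combines several simplicial adjacencies, and all names and scale choices here are illustrative.

```python
import numpy as np

def lazy_random_walk(A):
    """Lazy random walk matrix P = (I + D^{-1} A) / 2 for adjacency A."""
    d = A.sum(axis=1).astype(float)
    d[d == 0] = 1.0  # guard against isolated simplices
    return 0.5 * (np.eye(A.shape[0]) + np.diag(1.0 / d) @ A)

def wavelet_bank(P, J):
    """Dyadic wavelets Psi_j = P^{2^{j-1}} - P^{2^j} for j = 1..J.

    Each Psi_j isolates a frequency band: differences of diffusion
    operators at successive dyadic scales capture progressively
    lower-frequency (larger-scale) variation.
    """
    mats, P_prev = [], P.copy()  # P_prev starts at P^{2^0}
    for _ in range(J):
        P_next = P_prev @ P_prev  # square to reach the next dyadic power
        mats.append(P_prev - P_next)
        P_prev = P_next
    return mats

def scatter(A, x, J=3):
    """Zeroth-, first-, and second-order scattering coefficients of x.

    The modulus nonlinearity after each filtering step pushes energy
    back toward low frequencies, where averaging can capture it; the
    whole map is parameter-free, as the abstract emphasizes.
    """
    P = lazy_random_walk(A)
    psis = wavelet_bank(P, J)
    feats = [x.mean()]                      # zeroth order
    for psi1 in psis:
        u1 = np.abs(psi1 @ x)               # first-order modulus
        feats.append(u1.mean())
        for psi2 in psis:                   # second-order cascade
            feats.append(np.abs(psi2 @ u1).mean())
    return np.array(feats)
```

With `J` scales and two cascade orders, this yields `1 + J + J**2` invariant coefficients per signal, computable with sparse matrix-vector products and no training.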

Cite this Paper


BibTeX
@InProceedings{pmlr-v235-madhu24a,
  title     = {Unsupervised Parameter-free Simplicial Representation Learning with Scattering Transforms},
  author    = {Madhu, Hiren and Gurugubelli, Sravanthi and Chepuri, Sundeep Prabhakar},
  booktitle = {Proceedings of the 41st International Conference on Machine Learning},
  pages     = {34145--34160},
  year      = {2024},
  editor    = {Salakhutdinov, Ruslan and Kolter, Zico and Heller, Katherine and Weller, Adrian and Oliver, Nuria and Scarlett, Jonathan and Berkenkamp, Felix},
  volume    = {235},
  series    = {Proceedings of Machine Learning Research},
  month     = {21--27 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v235/main/assets/madhu24a/madhu24a.pdf},
  url       = {https://proceedings.mlr.press/v235/madhu24a.html},
  abstract  = {Simplicial neural network models are becoming popular for processing and analyzing higher-order graph data, but they suffer from high training complexity and dependence on task-specific labels. To address these challenges, we propose simplicial scattering networks (SSNs), a parameter-free model inspired by scattering transforms designed to extract task-agnostic features from simplicial complex data without labels in a principled manner. Specifically, we propose a simplicial scattering transform based on random walk matrices for various adjacencies underlying a simplicial complex. We then use the simplicial scattering transform to construct a deep filter bank network that captures high-frequency information at multiple scales. The proposed simplicial scattering transform possesses properties such as permutation invariance, robustness to perturbations, and expressivity. We theoretically prove that including higher-order information improves the robustness of SSNs to perturbations. Empirical evaluations demonstrate that SSNs outperform existing simplicial or graph neural models in many tasks like node classification, simplicial closure, graph classification, trajectory prediction, and simplex prediction while being computationally efficient.}
}
Endnote
%0 Conference Paper %T Unsupervised Parameter-free Simplicial Representation Learning with Scattering Transforms %A Hiren Madhu %A Sravanthi Gurugubelli %A Sundeep Prabhakar Chepuri %B Proceedings of the 41st International Conference on Machine Learning %C Proceedings of Machine Learning Research %D 2024 %E Ruslan Salakhutdinov %E Zico Kolter %E Katherine Heller %E Adrian Weller %E Nuria Oliver %E Jonathan Scarlett %E Felix Berkenkamp %F pmlr-v235-madhu24a %I PMLR %P 34145--34160 %U https://proceedings.mlr.press/v235/madhu24a.html %V 235 %X Simplicial neural network models are becoming popular for processing and analyzing higher-order graph data, but they suffer from high training complexity and dependence on task-specific labels. To address these challenges, we propose simplicial scattering networks (SSNs), a parameter-free model inspired by scattering transforms designed to extract task-agnostic features from simplicial complex data without labels in a principled manner. Specifically, we propose a simplicial scattering transform based on random walk matrices for various adjacencies underlying a simplicial complex. We then use the simplicial scattering transform to construct a deep filter bank network that captures high-frequency information at multiple scales. The proposed simplicial scattering transform possesses properties such as permutation invariance, robustness to perturbations, and expressivity. We theoretically prove that including higher-order information improves the robustness of SSNs to perturbations. Empirical evaluations demonstrate that SSNs outperform existing simplicial or graph neural models in many tasks like node classification, simplicial closure, graph classification, trajectory prediction, and simplex prediction while being computationally efficient.
APA
Madhu, H., Gurugubelli, S. & Chepuri, S.P. (2024). Unsupervised Parameter-free Simplicial Representation Learning with Scattering Transforms. Proceedings of the 41st International Conference on Machine Learning, in Proceedings of Machine Learning Research 235:34145-34160. Available from https://proceedings.mlr.press/v235/madhu24a.html.
