RTD-Lite: Scalable Topological Analysis for Comparing Weighted Graphs in Learning Tasks

Eduard Tulchinskii, Daria Voronkova, Ilya Trofimov, Evgeny Burnaev, Serguei Barannikov
Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, PMLR 258:3826-3834, 2025.

Abstract

Topological methods for comparing weighted graphs are valuable in various learning tasks but often suffer from computational inefficiency on large datasets. We introduce RTD-Lite, a scalable algorithm that efficiently compares topological features, specifically connectivity or cluster structures at arbitrary scales, of two weighted graphs with one-to-one correspondence between vertices. By leveraging minimal spanning trees in auxiliary graphs, RTD-Lite captures topological discrepancies with $O(n^2)$ time and memory complexity. This efficiency enables its application in tasks like dimensionality reduction and neural network training. Experiments on synthetic and real-world datasets demonstrate that RTD-Lite effectively identifies topological differences while significantly reducing computation time compared to existing methods. Moreover, integrating RTD-Lite into neural network training as a loss function component enhances the preservation of topological structures in learned representations. Our code is publicly available at \url{https://github.com/ArGintum/RTD-Lite}.
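The abstract only outlines the method at a high level. As a rough, hedged illustration of the kind of MST-based comparison it describes, the Python sketch below contrasts the minimum-spanning-tree merge scales of a graph with weights $w^A$ against an auxiliary graph with edge weights $\min(w^A, w^B)$. The function names (`mst_edge_weights`, `rtd_lite_sketch`), the specific auxiliary-graph construction, and the final aggregation are assumptions made for illustration only, not the algorithm from the paper; the authors' implementation is available at the linked repository.

```python
# Illustrative sketch only: an MST-based comparison of two weighted graphs
# given as dense n x n weight matrices with matched vertices. It follows the
# general idea stated in the abstract (minimum spanning trees on auxiliary
# graphs, roughly O(n^2) work), NOT the exact RTD-Lite algorithm.
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree


def mst_edge_weights(weights: np.ndarray) -> np.ndarray:
    """Return the sorted edge weights of an MST of a dense weight matrix."""
    mst = minimum_spanning_tree(weights)  # sparse matrix with n - 1 edges
    return np.sort(mst.data)


def rtd_lite_sketch(w_a: np.ndarray, w_b: np.ndarray) -> float:
    """Toy discrepancy between the cluster structures of two graphs.

    The auxiliary graph with weights min(w_a, w_b) merges clusters at least
    as early as graph A alone, so comparing the two sorted MST weight
    profiles gives a crude measure of how much B's connectivity differs
    from A's across scales. (Hypothetical simplification of RTD-Lite.)
    """
    a_scales = mst_edge_weights(w_a)
    ab_scales = mst_edge_weights(np.minimum(w_a, w_b))
    return float(np.sum(a_scales - ab_scales))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.normal(size=(200, 8))
    y = x + 0.1 * rng.normal(size=x.shape)  # mild perturbation of x
    # Pairwise Euclidean distances serve as graph weights.
    w_x = np.linalg.norm(x[:, None] - x[None, :], axis=-1)
    w_y = np.linalg.norm(y[:, None] - y[None, :], axis=-1)
    print("discrepancy(x, y):", rtd_lite_sketch(w_x, w_y))
    print("discrepancy(x, x):", rtd_lite_sketch(w_x, w_x))  # should be ~0
```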

Cite this Paper


BibTeX
@InProceedings{pmlr-v258-tulchinskii25a,
  title     = {RTD-Lite: Scalable Topological Analysis for Comparing Weighted Graphs in Learning Tasks},
  author    = {Tulchinskii, Eduard and Voronkova, Daria and Trofimov, Ilya and Burnaev, Evgeny and Barannikov, Serguei},
  booktitle = {Proceedings of The 28th International Conference on Artificial Intelligence and Statistics},
  pages     = {3826--3834},
  year      = {2025},
  editor    = {Li, Yingzhen and Mandt, Stephan and Agrawal, Shipra and Khan, Emtiyaz},
  volume    = {258},
  series    = {Proceedings of Machine Learning Research},
  month     = {03--05 May},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v258/main/assets/tulchinskii25a/tulchinskii25a.pdf},
  url       = {https://proceedings.mlr.press/v258/tulchinskii25a.html},
  abstract  = {Topological methods for comparing weighted graphs are valuable in various learning tasks but often suffer from computational inefficiency on large datasets. We introduce RTD-Lite, a scalable algorithm that efficiently compares topological features, specifically connectivity or cluster structures at arbitrary scales, of two weighted graphs with one-to-one correspondence between vertices. By leveraging minimal spanning trees in auxiliary graphs, RTD-Lite captures topological discrepancies with $O(n^2)$ time and memory complexity. This efficiency enables its application in tasks like dimensionality reduction and neural network training. Experiments on synthetic and real-world datasets demonstrate that RTD-Lite effectively identifies topological differences while significantly reducing computation time compared to existing methods. Moreover, integrating RTD-Lite into neural network training as a loss function component enhances the preservation of topological structures in learned representations. Our code is publicly available at \url{https://github.com/ArGintum/RTD-Lite}.}
}
Endnote
%0 Conference Paper
%T RTD-Lite: Scalable Topological Analysis for Comparing Weighted Graphs in Learning Tasks
%A Eduard Tulchinskii
%A Daria Voronkova
%A Ilya Trofimov
%A Evgeny Burnaev
%A Serguei Barannikov
%B Proceedings of The 28th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2025
%E Yingzhen Li
%E Stephan Mandt
%E Shipra Agrawal
%E Emtiyaz Khan
%F pmlr-v258-tulchinskii25a
%I PMLR
%P 3826--3834
%U https://proceedings.mlr.press/v258/tulchinskii25a.html
%V 258
%X Topological methods for comparing weighted graphs are valuable in various learning tasks but often suffer from computational inefficiency on large datasets. We introduce RTD-Lite, a scalable algorithm that efficiently compares topological features, specifically connectivity or cluster structures at arbitrary scales, of two weighted graphs with one-to-one correspondence between vertices. By leveraging minimal spanning trees in auxiliary graphs, RTD-Lite captures topological discrepancies with $O(n^2)$ time and memory complexity. This efficiency enables its application in tasks like dimensionality reduction and neural network training. Experiments on synthetic and real-world datasets demonstrate that RTD-Lite effectively identifies topological differences while significantly reducing computation time compared to existing methods. Moreover, integrating RTD-Lite into neural network training as a loss function component enhances the preservation of topological structures in learned representations. Our code is publicly available at \url{https://github.com/ArGintum/RTD-Lite}.
APA
Tulchinskii, E., Voronkova, D., Trofimov, I., Burnaev, E., & Barannikov, S. (2025). RTD-Lite: Scalable Topological Analysis for Comparing Weighted Graphs in Learning Tasks. Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 258:3826-3834. Available from https://proceedings.mlr.press/v258/tulchinskii25a.html.
