Hyperbolic Diffusion Embedding and Distance for Hierarchical Representation Learning

Ya-Wei Eileen Lin, Ronald R. Coifman, Gal Mishne, Ronen Talmon
Proceedings of the 40th International Conference on Machine Learning, PMLR 202:21003-21025, 2023.

Abstract

Finding meaningful representations and distances of hierarchical data is important in many fields. This paper presents a new method for hierarchical data embedding and distance. Our method relies on combining diffusion geometry, a central approach to manifold learning, and hyperbolic geometry. Specifically, using diffusion geometry, we build multi-scale densities on the data, aimed to reveal their hierarchical structure, and then embed them into a product of hyperbolic spaces. We show theoretically that our embedding and distance recover the underlying hierarchical structure. In addition, we demonstrate the efficacy of the proposed method and its advantages compared to existing methods on graph embedding benchmarks and hierarchical datasets.
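The abstract outlines the pipeline at a high level: construct a diffusion operator on the data, form multi-scale densities from its powers, and embed those densities into a product of hyperbolic spaces whose distances expose the hierarchy. The sketch below is an illustrative reading of that recipe, not the authors' reference implementation: the Gaussian kernel bandwidth, the square-root lift of each density into the hyperbolic upper half-space model, the per-scale height 2^-(k+1), and the per-scale weights 2^(-k/2) are all assumptions made for the example.

# A minimal sketch (assumed choices throughout, not the paper's exact construction):
# diffusion operator -> dyadic powers (multi-scale densities) -> hyperbolic lift per
# scale -> summed hyperbolic distances across scales.
import numpy as np
from scipy.spatial.distance import cdist


def diffusion_operator(X, eps=None):
    """Row-stochastic Markov matrix from a Gaussian affinity kernel."""
    D2 = cdist(X, X, metric="sqeuclidean")
    if eps is None:
        eps = np.median(D2)                      # heuristic bandwidth (assumption)
    W = np.exp(-D2 / eps)
    return W / W.sum(axis=1, keepdims=True)


def halfspace_distance(U, V, height_u, height_v):
    """Pairwise distances in the hyperbolic upper half-space model,
    where each point is a pair (u, height) with height > 0."""
    sq = cdist(U, V, metric="sqeuclidean")
    num = sq + (height_u[:, None] - height_v[None, :]) ** 2
    den = 2.0 * height_u[:, None] * height_v[None, :]
    return np.arccosh(1.0 + num / den)


def hyperbolic_diffusion_distance(X, num_scales=6):
    """Sum of hyperbolic distances between multi-scale diffusion densities."""
    P = diffusion_operator(X)
    n = X.shape[0]
    total = np.zeros((n, n))
    Pk = P.copy()
    for k in range(num_scales):
        Pk = Pk @ Pk                             # dyadic power: P^(2^(k+1))
        U = np.sqrt(Pk)                          # square-root lift of row densities
        h = np.full(n, 2.0 ** -(k + 1))          # illustrative per-scale height
        total += 2.0 ** -(k / 2) * halfspace_distance(U, U, h, h)
    return total

For a point set X of shape (n, d), hyperbolic_diffusion_distance(X) returns an n-by-n matrix that can stand in for Euclidean distances in downstream clustering or tree reconstruction; the dyadic powers are what give the construction its multi-scale, hierarchical character.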

Cite this Paper


BibTeX
@InProceedings{pmlr-v202-lin23b,
  title     = {Hyperbolic Diffusion Embedding and Distance for Hierarchical Representation Learning},
  author    = {Lin, Ya-Wei Eileen and Coifman, Ronald R. and Mishne, Gal and Talmon, Ronen},
  booktitle = {Proceedings of the 40th International Conference on Machine Learning},
  pages     = {21003--21025},
  year      = {2023},
  editor    = {Krause, Andreas and Brunskill, Emma and Cho, Kyunghyun and Engelhardt, Barbara and Sabato, Sivan and Scarlett, Jonathan},
  volume    = {202},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--29 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v202/lin23b/lin23b.pdf},
  url       = {https://proceedings.mlr.press/v202/lin23b.html},
  abstract  = {Finding meaningful representations and distances of hierarchical data is important in many fields. This paper presents a new method for hierarchical data embedding and distance. Our method relies on combining diffusion geometry, a central approach to manifold learning, and hyperbolic geometry. Specifically, using diffusion geometry, we build multi-scale densities on the data, aimed to reveal their hierarchical structure, and then embed them into a product of hyperbolic spaces. We show theoretically that our embedding and distance recover the underlying hierarchical structure. In addition, we demonstrate the efficacy of the proposed method and its advantages compared to existing methods on graph embedding benchmarks and hierarchical datasets.}
}
Endnote
%0 Conference Paper
%T Hyperbolic Diffusion Embedding and Distance for Hierarchical Representation Learning
%A Ya-Wei Eileen Lin
%A Ronald R. Coifman
%A Gal Mishne
%A Ronen Talmon
%B Proceedings of the 40th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Andreas Krause
%E Emma Brunskill
%E Kyunghyun Cho
%E Barbara Engelhardt
%E Sivan Sabato
%E Jonathan Scarlett
%F pmlr-v202-lin23b
%I PMLR
%P 21003--21025
%U https://proceedings.mlr.press/v202/lin23b.html
%V 202
%X Finding meaningful representations and distances of hierarchical data is important in many fields. This paper presents a new method for hierarchical data embedding and distance. Our method relies on combining diffusion geometry, a central approach to manifold learning, and hyperbolic geometry. Specifically, using diffusion geometry, we build multi-scale densities on the data, aimed to reveal their hierarchical structure, and then embed them into a product of hyperbolic spaces. We show theoretically that our embedding and distance recover the underlying hierarchical structure. In addition, we demonstrate the efficacy of the proposed method and its advantages compared to existing methods on graph embedding benchmarks and hierarchical datasets.
APA
Lin, Y.E., Coifman, R.R., Mishne, G. & Talmon, R. (2023). Hyperbolic Diffusion Embedding and Distance for Hierarchical Representation Learning. Proceedings of the 40th International Conference on Machine Learning, in Proceedings of Machine Learning Research 202:21003-21025. Available from https://proceedings.mlr.press/v202/lin23b.html.
