Representation Tradeoffs for Hyperbolic Embeddings

Frederic Sala, Chris De Sa, Albert Gu, Christopher Re
Proceedings of the 35th International Conference on Machine Learning, PMLR 80:4460-4469, 2018.

Abstract

Hyperbolic embeddings offer excellent quality with few dimensions when embedding hierarchical data structures. We give a combinatorial construction that embeds trees into hyperbolic space with arbitrarily low distortion without optimization. On WordNet, this algorithm obtains a mean-average-precision of 0.989 with only two dimensions, outperforming existing work by 0.11 points. We provide bounds characterizing the precision-dimensionality tradeoff inherent in any hyperbolic embedding. To embed general metric spaces, we propose a hyperbolic generalization of multidimensional scaling (h-MDS). We show how to perform exact recovery of hyperbolic points from distances, provide a perturbation analysis, and give a recovery result that enables us to reduce dimensionality. Finally, we extract lessons from the algorithms and theory above to design a scalable PyTorch-based implementation that can handle incomplete information.
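The embeddings discussed in the abstract live in models of hyperbolic space such as the Poincaré disk, where distances grow rapidly as points approach the boundary — the property that lets trees embed with low distortion. Below is a minimal sketch of the Poincaré-disk distance function (a standard formula, not the paper's implementation); the point values are illustrative only.

```python
import math

def poincare_dist(x, y):
    """Hyperbolic distance between two points in the Poincare disk model.

    x, y: 2D points (tuples) with Euclidean norm strictly less than 1.
    Uses the standard formula
        d(x, y) = arccosh(1 + 2|x - y|^2 / ((1 - |x|^2)(1 - |y|^2))).
    """
    diff_sq = (x[0] - y[0]) ** 2 + (x[1] - y[1]) ** 2
    norm_x_sq = x[0] ** 2 + x[1] ** 2
    norm_y_sq = y[0] ** 2 + y[1] ** 2
    return math.acosh(1.0 + 2.0 * diff_sq / ((1.0 - norm_x_sq) * (1.0 - norm_y_sq)))

# Distances blow up near the boundary: the same Euclidean step costs
# far more hyperbolic distance close to the disk's edge.
near_center = poincare_dist((0.0, 0.0), (0.9, 0.0))
near_boundary = poincare_dist((0.0, 0.0), (0.99, 0.0))
```

This exponential growth of volume toward the boundary is what gives hyperbolic space room to host exponentially branching hierarchies in only two dimensions.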

Cite this Paper

BibTeX
@InProceedings{pmlr-v80-sala18a,
  title =     {Representation Tradeoffs for Hyperbolic Embeddings},
  author =    {Sala, Frederic and De Sa, Chris and Gu, Albert and Re, Christopher},
  booktitle = {Proceedings of the 35th International Conference on Machine Learning},
  pages =     {4460--4469},
  year =      {2018},
  editor =    {Dy, Jennifer and Krause, Andreas},
  volume =    {80},
  series =    {Proceedings of Machine Learning Research},
  month =     {10--15 Jul},
  publisher = {PMLR},
  pdf =       {http://proceedings.mlr.press/v80/sala18a/sala18a.pdf},
  url =       {http://proceedings.mlr.press/v80/sala18a.html},
  abstract =  {Hyperbolic embeddings offer excellent quality with few dimensions when embedding hierarchical data structures. We give a combinatorial construction that embeds trees into hyperbolic space with arbitrarily low distortion without optimization. On WordNet, this algorithm obtains a mean-average-precision of 0.989 with only two dimensions, outperforming existing work by 0.11 points. We provide bounds characterizing the precision-dimensionality tradeoff inherent in any hyperbolic embedding. To embed general metric spaces, we propose a hyperbolic generalization of multidimensional scaling (h-MDS). We show how to perform exact recovery of hyperbolic points from distances, provide a perturbation analysis, and give a recovery result that enables us to reduce dimensionality. Finally, we extract lessons from the algorithms and theory above to design a scalable PyTorch-based implementation that can handle incomplete information.}
}
Endnote
%0 Conference Paper
%T Representation Tradeoffs for Hyperbolic Embeddings
%A Frederic Sala
%A Chris De Sa
%A Albert Gu
%A Christopher Re
%B Proceedings of the 35th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2018
%E Jennifer Dy
%E Andreas Krause
%F pmlr-v80-sala18a
%I PMLR
%P 4460--4469
%U http://proceedings.mlr.press/v80/sala18a.html
%V 80
%X Hyperbolic embeddings offer excellent quality with few dimensions when embedding hierarchical data structures. We give a combinatorial construction that embeds trees into hyperbolic space with arbitrarily low distortion without optimization. On WordNet, this algorithm obtains a mean-average-precision of 0.989 with only two dimensions, outperforming existing work by 0.11 points. We provide bounds characterizing the precision-dimensionality tradeoff inherent in any hyperbolic embedding. To embed general metric spaces, we propose a hyperbolic generalization of multidimensional scaling (h-MDS). We show how to perform exact recovery of hyperbolic points from distances, provide a perturbation analysis, and give a recovery result that enables us to reduce dimensionality. Finally, we extract lessons from the algorithms and theory above to design a scalable PyTorch-based implementation that can handle incomplete information.
APA
Sala, F., De Sa, C., Gu, A. & Re, C. (2018). Representation Tradeoffs for Hyperbolic Embeddings. Proceedings of the 35th International Conference on Machine Learning, in Proceedings of Machine Learning Research 80:4460-4469. Available from http://proceedings.mlr.press/v80/sala18a.html.