Hyperbolic Entailment Cones for Learning Hierarchical Embeddings

Octavian Ganea, Gary Becigneul, Thomas Hofmann
Proceedings of the 35th International Conference on Machine Learning, PMLR 80:1646-1655, 2018.

Abstract

Learning graph representations via low-dimensional embeddings that preserve relevant network properties is an important class of problems in machine learning. We here present a novel method to embed directed acyclic graphs. Following prior work, we first advocate for using hyperbolic spaces which provably model tree-like structures better than Euclidean geometry. Second, we view hierarchical relations as partial orders defined using a family of nested geodesically convex cones. We prove that these entailment cones admit an optimal shape with a closed form expression both in the Euclidean and hyperbolic spaces, and they canonically define the embedding learning process. Experiments show significant improvements of our method over strong recent baselines both in terms of representational capacity and generalization.
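As a pointer for readers skimming the abstract, the closed-form cone shapes it mentions can be sketched as follows. This is a hedged recollection of the paper's result, not a verbatim quote: $K > 0$ denotes a fixed aperture hyperparameter, and both expressions are stated in the paper to hold only for $\lVert x\rVert$ above a $K$-dependent threshold away from the origin.

```latex
% Aperture \psi(x) of the entailment cone with apex x:
%
% Euclidean space:
\psi_{E}(x) = \arcsin\!\left(\frac{K}{\lVert x\rVert}\right)
%
% Poincare ball (hyperbolic space):
\psi_{H}(x) = \arcsin\!\left(\frac{K\,\bigl(1-\lVert x\rVert^{2}\bigr)}{\lVert x\rVert}\right)
```

A point $y$ is then taken to be entailed by $x$ when the angle at the apex between the geodesic from $x$ to $y$ and the axis of the cone (the geodesic through the origin and $x$) does not exceed $\psi(x)$; the nesting of these cones is what induces the partial order used for training.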

Cite this Paper


BibTeX
@InProceedings{pmlr-v80-ganea18a,
  title     = {Hyperbolic Entailment Cones for Learning Hierarchical Embeddings},
  author    = {Ganea, Octavian and Becigneul, Gary and Hofmann, Thomas},
  booktitle = {Proceedings of the 35th International Conference on Machine Learning},
  pages     = {1646--1655},
  year      = {2018},
  editor    = {Dy, Jennifer and Krause, Andreas},
  volume    = {80},
  series    = {Proceedings of Machine Learning Research},
  month     = {10--15 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v80/ganea18a/ganea18a.pdf},
  url       = {http://proceedings.mlr.press/v80/ganea18a.html},
  abstract  = {Learning graph representations via low-dimensional embeddings that preserve relevant network properties is an important class of problems in machine learning. We here present a novel method to embed directed acyclic graphs. Following prior work, we first advocate for using hyperbolic spaces which provably model tree-like structures better than Euclidean geometry. Second, we view hierarchical relations as partial orders defined using a family of nested geodesically convex cones. We prove that these entailment cones admit an optimal shape with a closed form expression both in the Euclidean and hyperbolic spaces, and they canonically define the embedding learning process. Experiments show significant improvements of our method over strong recent baselines both in terms of representational capacity and generalization.}
}
Endnote
%0 Conference Paper
%T Hyperbolic Entailment Cones for Learning Hierarchical Embeddings
%A Octavian Ganea
%A Gary Becigneul
%A Thomas Hofmann
%B Proceedings of the 35th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2018
%E Jennifer Dy
%E Andreas Krause
%F pmlr-v80-ganea18a
%I PMLR
%P 1646--1655
%U http://proceedings.mlr.press/v80/ganea18a.html
%V 80
%X Learning graph representations via low-dimensional embeddings that preserve relevant network properties is an important class of problems in machine learning. We here present a novel method to embed directed acyclic graphs. Following prior work, we first advocate for using hyperbolic spaces which provably model tree-like structures better than Euclidean geometry. Second, we view hierarchical relations as partial orders defined using a family of nested geodesically convex cones. We prove that these entailment cones admit an optimal shape with a closed form expression both in the Euclidean and hyperbolic spaces, and they canonically define the embedding learning process. Experiments show significant improvements of our method over strong recent baselines both in terms of representational capacity and generalization.
APA
Ganea, O., Becigneul, G. & Hofmann, T. (2018). Hyperbolic Entailment Cones for Learning Hierarchical Embeddings. Proceedings of the 35th International Conference on Machine Learning, in Proceedings of Machine Learning Research 80:1646-1655. Available from http://proceedings.mlr.press/v80/ganea18a.html.