Hyperbolic Entailment Cones for Learning Hierarchical Embeddings


Octavian Ganea, Gary Bécigneul, Thomas Hofmann;
Proceedings of the 35th International Conference on Machine Learning, PMLR 80:1646-1655, 2018.

Abstract

Learning graph representations via low-dimensional embeddings that preserve relevant network properties is an important class of problems in machine learning. We here present a novel method to embed directed acyclic graphs. Following prior work, we first advocate for using hyperbolic spaces which provably model tree-like structures better than Euclidean geometry. Second, we view hierarchical relations as partial orders defined using a family of nested geodesically convex cones. We prove that these entailment cones admit an optimal shape with a closed form expression both in the Euclidean and hyperbolic spaces, and they canonically define the embedding learning process. Experiments show significant improvements of our method over strong recent baselines both in terms of representational capacity and generalization.
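The entailment cones above can be sketched concretely. The following is a minimal illustration in the Poincaré ball model, assuming the paper's closed-form half-aperture ψ(x) = arcsin(K(1 − ‖x‖²)/‖x‖) and the closed-form angle Ξ(x, y) between the cone axis at x and the geodesic from x toward y; the constant K and the membership check are sketched from the paper's formulas, not a reference implementation.

```python
import numpy as np

K = 0.1  # aperture constant; points are assumed to satisfy 0 < eps <= ||x|| < 1

def psi(x):
    """Half-aperture of the entailment cone at x (Poincare ball model)."""
    nx = np.linalg.norm(x)
    return np.arcsin(K * (1 - nx ** 2) / nx)

def xi(x, y):
    """Angle at x between the cone axis and the geodesic toward y (closed form)."""
    nx2, ny2, xy = x @ x, y @ y, x @ y
    num = xy * (1 + nx2) - nx2 * (1 + ny2)
    den = np.sqrt(nx2) * np.linalg.norm(x - y) * np.sqrt(1 + nx2 * ny2 - 2 * xy)
    return np.arccos(np.clip(num / den, -1.0, 1.0))

def entails(x, y):
    """True iff y lies in the entailment cone rooted at x."""
    return xi(x, y) <= psi(x)
```

For example, with x = (0.5, 0) and y = (0.7, 0) on the same ray from the origin, y lies deeper in the cone of x, so entails(x, y) holds while entails(y, x) does not, reflecting the asymmetry of the induced partial order.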
