Generalizing Knowledge Graph Embedding with Universal Orthogonal Parameterization

Rui Li, Chaozhuo Li, Yanming Shen, Zeyu Zhang, Xu Chen
Proceedings of the 41st International Conference on Machine Learning, PMLR 235:28040-28059, 2024.

Abstract

Recent advances in knowledge graph embedding (KGE) rely on Euclidean/hyperbolic orthogonal relation transformations to model intrinsic logical patterns and topological structures. However, existing approaches are confined to rigid relational orthogonalization with restricted dimension and homogeneous geometry, leading to deficient modeling capability. In this work, we move beyond these approaches in terms of both dimension and geometry by introducing a powerful framework named GoldE, which features a universal orthogonal parameterization based on a generalized form of Householder reflection. Such parameterization can naturally achieve dimensional extension and geometric unification with theoretical guarantees, enabling our framework to simultaneously capture crucial logical patterns and inherent topological heterogeneity of knowledge graphs. Empirically, GoldE achieves state-of-the-art performance on three standard benchmarks. Codes are available at https://github.com/xxrep/GoldE.
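The parameterization described in the abstract builds on the classical Householder reflection, H = I − 2vvᵀ/(vᵀv), whose products are always orthogonal matrices. The sketch below illustrates only this classical fact, not the paper's generalized form; the function names and dimensions are illustrative assumptions.

```python
import numpy as np

def householder(v):
    """Classical Householder reflection H = I - 2 vv^T / (v^T v).

    H is symmetric and orthogonal (H^T H = I); chaining such
    reflections is one standard way to parameterize orthogonal
    relation transformations in KGE models.
    """
    v = np.asarray(v, dtype=float)
    return np.eye(v.size) - 2.0 * np.outer(v, v) / (v @ v)

# A product of Householder reflections remains orthogonal,
# so it preserves norms and distances of entity embeddings.
rng = np.random.default_rng(0)
Q = np.eye(4)
for _ in range(3):
    Q = Q @ householder(rng.standard_normal(4))

assert np.allclose(Q.T @ Q, np.eye(4))  # orthogonality check
```

Because Q is orthogonal, applying it to an entity embedding changes only its direction, never its norm, which is why such transformations can encode relational patterns (e.g., symmetry, inversion) without distorting the embedding space.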

Cite this Paper


BibTeX
@InProceedings{pmlr-v235-li24ah,
  title     = {Generalizing Knowledge Graph Embedding with Universal Orthogonal Parameterization},
  author    = {Li, Rui and Li, Chaozhuo and Shen, Yanming and Zhang, Zeyu and Chen, Xu},
  booktitle = {Proceedings of the 41st International Conference on Machine Learning},
  pages     = {28040--28059},
  year      = {2024},
  editor    = {Salakhutdinov, Ruslan and Kolter, Zico and Heller, Katherine and Weller, Adrian and Oliver, Nuria and Scarlett, Jonathan and Berkenkamp, Felix},
  volume    = {235},
  series    = {Proceedings of Machine Learning Research},
  month     = {21--27 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v235/main/assets/li24ah/li24ah.pdf},
  url       = {https://proceedings.mlr.press/v235/li24ah.html},
  abstract  = {Recent advances in knowledge graph embedding (KGE) rely on Euclidean/hyperbolic orthogonal relation transformations to model intrinsic logical patterns and topological structures. However, existing approaches are confined to rigid relational orthogonalization with restricted dimension and homogeneous geometry, leading to deficient modeling capability. In this work, we move beyond these approaches in terms of both dimension and geometry by introducing a powerful framework named GoldE, which features a universal orthogonal parameterization based on a generalized form of Householder reflection. Such parameterization can naturally achieve dimensional extension and geometric unification with theoretical guarantees, enabling our framework to simultaneously capture crucial logical patterns and inherent topological heterogeneity of knowledge graphs. Empirically, GoldE achieves state-of-the-art performance on three standard benchmarks. Codes are available at https://github.com/xxrep/GoldE.}
}
Endnote
%0 Conference Paper
%T Generalizing Knowledge Graph Embedding with Universal Orthogonal Parameterization
%A Rui Li
%A Chaozhuo Li
%A Yanming Shen
%A Zeyu Zhang
%A Xu Chen
%B Proceedings of the 41st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2024
%E Ruslan Salakhutdinov
%E Zico Kolter
%E Katherine Heller
%E Adrian Weller
%E Nuria Oliver
%E Jonathan Scarlett
%E Felix Berkenkamp
%F pmlr-v235-li24ah
%I PMLR
%P 28040--28059
%U https://proceedings.mlr.press/v235/li24ah.html
%V 235
%X Recent advances in knowledge graph embedding (KGE) rely on Euclidean/hyperbolic orthogonal relation transformations to model intrinsic logical patterns and topological structures. However, existing approaches are confined to rigid relational orthogonalization with restricted dimension and homogeneous geometry, leading to deficient modeling capability. In this work, we move beyond these approaches in terms of both dimension and geometry by introducing a powerful framework named GoldE, which features a universal orthogonal parameterization based on a generalized form of Householder reflection. Such parameterization can naturally achieve dimensional extension and geometric unification with theoretical guarantees, enabling our framework to simultaneously capture crucial logical patterns and inherent topological heterogeneity of knowledge graphs. Empirically, GoldE achieves state-of-the-art performance on three standard benchmarks. Codes are available at https://github.com/xxrep/GoldE.
APA
Li, R., Li, C., Shen, Y., Zhang, Z., &amp; Chen, X. (2024). Generalizing Knowledge Graph Embedding with Universal Orthogonal Parameterization. Proceedings of the 41st International Conference on Machine Learning, in Proceedings of Machine Learning Research 235:28040-28059. Available from https://proceedings.mlr.press/v235/li24ah.html.