HousE: Knowledge Graph Embedding with Householder Parameterization

Rui Li, Jianan Zhao, Chaozhuo Li, Di He, Yiqi Wang, Yuming Liu, Hao Sun, Senzhang Wang, Weiwei Deng, Yanming Shen, Xing Xie, Qi Zhang
Proceedings of the 39th International Conference on Machine Learning, PMLR 162:13209-13224, 2022.

Abstract

The effectiveness of knowledge graph embedding (KGE) largely depends on the ability to model intrinsic relation patterns and mapping properties. However, existing approaches can only capture some of them with insufficient modeling capacity. In this work, we propose a more powerful KGE framework named HousE, which involves a novel parameterization based on two kinds of Householder transformations: (1) Householder rotations to achieve superior capacity of modeling relation patterns; (2) Householder projections to handle sophisticated relation mapping properties. Theoretically, HousE is capable of modeling crucial relation patterns and mapping properties simultaneously. Besides, HousE is a generalization of existing rotation-based models while extending the rotations to high-dimensional spaces. Empirically, HousE achieves new state-of-the-art performance on five benchmark datasets. Our code is available at https://github.com/anrep/HousE.
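For readers unfamiliar with the building block named in the abstract, the sketch below shows Householder transformations in plain NumPy: a single reflection I - 2vv^T/||v||^2 is orthogonal with determinant -1, and composing an even number of reflections yields a proper high-dimensional rotation, the kind of relation-specific operator the abstract refers to. This is an illustrative sketch under my own assumptions (dimensions, variable names, random parameterization), not the authors' implementation; see the linked repository for the actual model.

```python
import numpy as np

def householder_reflection(v):
    """Householder reflection I - 2 v v^T / ||v||^2 for a nonzero vector v."""
    v = np.asarray(v, dtype=float)
    return np.eye(len(v)) - 2.0 * np.outer(v, v) / np.dot(v, v)

rng = np.random.default_rng(0)
d, k = 4, 2                      # embedding dimension and number of reflection pairs (assumed)
R = np.eye(d)
for _ in range(2 * k):          # an even number of reflections composes to a proper rotation
    R = householder_reflection(rng.normal(size=d)) @ R

assert np.allclose(R @ R.T, np.eye(d))    # R is orthogonal
assert np.isclose(np.linalg.det(R), 1.0)  # det +1: a rotation, not a reflection
```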

Cite this Paper


BibTeX
@InProceedings{pmlr-v162-li22ab,
  title     = {{H}ous{E}: Knowledge Graph Embedding with Householder Parameterization},
  author    = {Li, Rui and Zhao, Jianan and Li, Chaozhuo and He, Di and Wang, Yiqi and Liu, Yuming and Sun, Hao and Wang, Senzhang and Deng, Weiwei and Shen, Yanming and Xie, Xing and Zhang, Qi},
  booktitle = {Proceedings of the 39th International Conference on Machine Learning},
  pages     = {13209--13224},
  year      = {2022},
  editor    = {Chaudhuri, Kamalika and Jegelka, Stefanie and Song, Le and Szepesvari, Csaba and Niu, Gang and Sabato, Sivan},
  volume    = {162},
  series    = {Proceedings of Machine Learning Research},
  month     = {17--23 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v162/li22ab/li22ab.pdf},
  url       = {https://proceedings.mlr.press/v162/li22ab.html}
}
APA
Li, R., Zhao, J., Li, C., He, D., Wang, Y., Liu, Y., Sun, H., Wang, S., Deng, W., Shen, Y., Xie, X. & Zhang, Q. (2022). HousE: Knowledge Graph Embedding with Householder Parameterization. Proceedings of the 39th International Conference on Machine Learning, in Proceedings of Machine Learning Research 162:13209-13224. Available from https://proceedings.mlr.press/v162/li22ab.html.

Related Material