Understanding More Knowledge Makes the Transformer Perform Better in Document-level Relation Extraction

Chen Haotian, Chen Yijiang, Zhou Xiangdong
Proceedings of the 15th Asian Conference on Machine Learning, PMLR 222:231-246, 2024.

Abstract

Relation extraction plays a vital role in knowledge graph construction. In contrast with traditional relation extraction on a single sentence, extracting relations from multiple sentences as a whole harvests richer and more valuable knowledge. Recently, Transformer-based pre-trained language models (TPLMs) have been widely adopted to tackle document-level relation extraction (DocRE). Graph-based methods, which acquire knowledge between entities to form entity-level relation graphs, have facilitated the rapid development of DocRE by infusing their proposed models with this knowledge. However, beyond entity-level knowledge, we discover many other kinds of knowledge that can aid humans in extracting relations. It remains unclear whether and how such knowledge can be adopted to improve the performance of the Transformer, which constrains the maximum performance gain of Transformer-based methods. In this paper, we propose a novel weighted multi-channel Transformer (WMCT) to infuse unlimited kinds of knowledge into the vanilla Transformer. Based on WMCT, we also explore five kinds of knowledge to enhance both its reasoning ability and expressive power. Our extensive experimental results demonstrate that: (1) more knowledge improves the performance of the Transformer, and (2) more informative knowledge yields a larger performance gain. We encourage future Transformer-based work to explore more informative knowledge to further improve the performance of the Transformer.
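The abstract does not spell out the WMCT architecture. As one plausible reading of "infusing knowledge channels into the vanilla Transformer with learnable weights," the sketch below augments self-attention scores with a weighted sum of pairwise knowledge bias matrices (e.g., coreference, co-occurrence, or dependency links). This is a minimal illustrative sketch, not the authors' released code; the class name, the knowledge_biases tensor layout, and the per-channel scalar weights are all assumptions made for illustration.

# Illustrative sketch only (PyTorch): weighted multi-channel attention bias.
# All names and the channel encoding are assumptions, not the paper's actual code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class WeightedMultiChannelAttention(nn.Module):
    """Self-attention whose scores are augmented by K weighted knowledge-channel biases."""

    def __init__(self, hidden_size: int, num_heads: int, num_channels: int):
        super().__init__()
        assert hidden_size % num_heads == 0
        self.num_heads = num_heads
        self.head_dim = hidden_size // num_heads
        self.qkv = nn.Linear(hidden_size, 3 * hidden_size)
        self.out = nn.Linear(hidden_size, hidden_size)
        # One learnable scalar weight per knowledge channel.
        self.channel_weights = nn.Parameter(torch.zeros(num_channels))

    def forward(self, x: torch.Tensor, knowledge_biases: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, hidden)
        # knowledge_biases: (batch, K, seq_len, seq_len), e.g. 0/1 masks per channel.
        bsz, seq_len, _ = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        q = q.view(bsz, seq_len, self.num_heads, self.head_dim).transpose(1, 2)
        k = k.view(bsz, seq_len, self.num_heads, self.head_dim).transpose(1, 2)
        v = v.view(bsz, seq_len, self.num_heads, self.head_dim).transpose(1, 2)

        scores = q @ k.transpose(-2, -1) / self.head_dim ** 0.5  # (bsz, heads, L, L)
        # Weighted sum over knowledge channels, broadcast across attention heads.
        bias = torch.einsum("c,bcij->bij", self.channel_weights, knowledge_biases)
        scores = scores + bias.unsqueeze(1)

        attn = F.softmax(scores, dim=-1)
        out = (attn @ v).transpose(1, 2).reshape(bsz, seq_len, -1)
        return self.out(out)

# Toy usage: 2 documents, 8 tokens, 3 knowledge channels.
layer = WeightedMultiChannelAttention(hidden_size=64, num_heads=4, num_channels=3)
tokens = torch.randn(2, 8, 64)
biases = torch.randint(0, 2, (2, 3, 8, 8)).float()
print(layer(tokens, biases).shape)  # torch.Size([2, 8, 64])

Under this reading, adding a new kind of knowledge only requires encoding it as one more pairwise bias channel, which matches the abstract's claim that the model can absorb an unbounded number of knowledge types.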

Cite this Paper


BibTeX
@InProceedings{pmlr-v222-haotian24a,
  title     = {Understanding More Knowledge Makes the Transformer Perform Better in Document-level Relation Extraction},
  author    = {Haotian, Chen and Yijiang, Chen and Xiangdong, Zhou},
  booktitle = {Proceedings of the 15th Asian Conference on Machine Learning},
  pages     = {231--246},
  year      = {2024},
  editor    = {Yanıkoğlu, Berrin and Buntine, Wray},
  volume    = {222},
  series    = {Proceedings of Machine Learning Research},
  month     = {11--14 Nov},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v222/haotian24a/haotian24a.pdf},
  url       = {https://proceedings.mlr.press/v222/haotian24a.html},
  abstract  = {Relation extraction plays a vital role in knowledge graph construction. In contrast with the traditional relation extraction on a single sentence, extracting relations from multiple sentences as a whole will harvest more valuable and richer knowledge. Recently, the Transformer-based pre-trained language models (TPLMs) are widely adopted to tackle document-level relation extraction (DocRE). Graph-based methods, aiming to acquire knowledge between entities to form entity-level relation graphs, have facilitated the rapid development of DocRE by infusing their proposed models with the knowledge. However, beyond entity-level knowledge, we discover many other kinds of knowledge that can aid humans to extract relations. It remains unclear whether and in which way they can be adopted to improve the performance of the Transformer, which affects the maximum performance gain of Transformer-based methods. In this paper, we propose a novel weighted multi-channel Transformer (WMCT) to infuse unlimited kinds of knowledge into the vanilla Transformer. Based on WMCT, we also explore five kinds of knowledge to enhance both its reasoning ability and expressive power. Our extensive experimental results demonstrate that: (1) more knowledge makes the performance of the Transformer better and (2) more informative knowledge leads to more performance gain. We appeal to future Transformer-based work to consider exploring more informative knowledge to improve the performance of the Transformer.}
}
Endnote
%0 Conference Paper
%T Understanding More Knowledge Makes the Transformer Perform Better in Document-level Relation Extraction
%A Chen Haotian
%A Chen Yijiang
%A Zhou Xiangdong
%B Proceedings of the 15th Asian Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2024
%E Berrin Yanıkoğlu
%E Wray Buntine
%F pmlr-v222-haotian24a
%I PMLR
%P 231--246
%U https://proceedings.mlr.press/v222/haotian24a.html
%V 222
%X Relation extraction plays a vital role in knowledge graph construction. In contrast with the traditional relation extraction on a single sentence, extracting relations from multiple sentences as a whole will harvest more valuable and richer knowledge. Recently, the Transformer-based pre-trained language models (TPLMs) are widely adopted to tackle document-level relation extraction (DocRE). Graph-based methods, aiming to acquire knowledge between entities to form entity-level relation graphs, have facilitated the rapid development of DocRE by infusing their proposed models with the knowledge. However, beyond entity-level knowledge, we discover many other kinds of knowledge that can aid humans to extract relations. It remains unclear whether and in which way they can be adopted to improve the performance of the Transformer, which affects the maximum performance gain of Transformer-based methods. In this paper, we propose a novel weighted multi-channel Transformer (WMCT) to infuse unlimited kinds of knowledge into the vanilla Transformer. Based on WMCT, we also explore five kinds of knowledge to enhance both its reasoning ability and expressive power. Our extensive experimental results demonstrate that: (1) more knowledge makes the performance of the Transformer better and (2) more informative knowledge leads to more performance gain. We appeal to future Transformer-based work to consider exploring more informative knowledge to improve the performance of the Transformer.
APA
Haotian, C., Yijiang, C. & Xiangdong, Z. (2024). Understanding More Knowledge Makes the Transformer Perform Better in Document-level Relation Extraction. Proceedings of the 15th Asian Conference on Machine Learning, in Proceedings of Machine Learning Research 222:231-246. Available from https://proceedings.mlr.press/v222/haotian24a.html.