Bidirectional Dependency-Guided Attention for Relation Extraction

Xingchen Deng, Lei Zhang, Yixing Fan, Long Bai, Jiafeng Guo, Pengfei Wang
Proceedings of The 12th Asian Conference on Machine Learning, PMLR 129:129-144, 2020.

Abstract

The dependency relations between words in a sentence are critical for relation extraction. Existing methods often utilize dependencies together with various pruning strategies and thus suffer from the loss of detailed semantic information. In order to exploit the dependency structure more effectively, we propose a novel bidirectional dependency-guided attention model. The main idea is to use a top-down attention as well as a bottom-up attention to fully capture the dependencies at different granularities. Specifically, the bottom-up attention models the local semantics from the subtree of each node, while the top-down attention models the global semantics from the ancestor nodes. Moreover, we employ a label embedding component to attend over the contextual features extracted by the dependency-guided attention. Overall, the proposed model is fully attention-based, which makes it easy to parallelize. Experimental results on the TACRED and SemEval 2010 Task 8 datasets show that our model outperforms existing dependency-based models as well as powerful pretrained models. Moreover, the proposed model achieves state-of-the-art performance on the TACRED dataset.
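To make the described architecture concrete, below is a minimal PyTorch sketch of how bidirectional dependency-guided attention with a label-embedding component could be wired up. It is an illustration based only on the abstract, not the authors' implementation: the module names (BiDepAttention, MaskedAttention), the mask construction from head indices, and the fusion and pooling choices are all assumptions made for this sketch.

    # Illustrative sketch only (not the paper's code). Assumptions: single-head
    # attention, masks derived from a head-index parse, simple concat fusion.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    def dependency_masks(heads):
        """Build top-down (ancestor) and bottom-up (subtree) attention masks
        from a dependency parse given as head indices (root points to itself)."""
        n = len(heads)
        ancestor = torch.zeros(n, n, dtype=torch.bool)
        for i in range(n):
            j = i
            while heads[j] != j:          # walk up toward the root
                j = heads[j]
                ancestor[i, j] = True
        subtree = ancestor.t().clone()    # subtree[i, j]: j lies in i's subtree
        eye = torch.eye(n, dtype=torch.bool)
        return ancestor | eye, subtree | eye   # let every node attend to itself

    class MaskedAttention(nn.Module):
        """Single-head scaled dot-product attention restricted by a mask."""
        def __init__(self, dim):
            super().__init__()
            self.q = nn.Linear(dim, dim)
            self.k = nn.Linear(dim, dim)
            self.v = nn.Linear(dim, dim)
            self.scale = dim ** -0.5

        def forward(self, x, mask):
            scores = self.q(x) @ self.k(x).transpose(-1, -2) * self.scale
            scores = scores.masked_fill(~mask, float('-inf'))
            return F.softmax(scores, dim=-1) @ self.v(x)

    class BiDepAttention(nn.Module):
        """Fuse bottom-up (subtree) and top-down (ancestor) views, then score
        the pooled contexts against learned relation-label embeddings."""
        def __init__(self, dim, num_labels):
            super().__init__()
            self.bottom_up = MaskedAttention(dim)
            self.top_down = MaskedAttention(dim)
            self.fuse = nn.Linear(2 * dim, dim)
            self.labels = nn.Embedding(num_labels, dim)

        def forward(self, x, heads):
            ancestor_mask, subtree_mask = dependency_masks(heads)
            local = self.bottom_up(x, subtree_mask)    # local semantics per subtree
            global_ = self.top_down(x, ancestor_mask)  # global semantics via ancestors
            ctx = torch.tanh(self.fuse(torch.cat([local, global_], dim=-1)))
            # label-embedding attention: each label attends over token contexts
            attn = F.softmax(self.labels.weight @ ctx.t(), dim=-1)   # (L, n)
            pooled = attn @ ctx                                      # (L, dim)
            return (pooled * self.labels.weight).sum(-1)             # one logit per label

    # Toy usage: a 5-token sentence whose root is token 1.
    x = torch.randn(5, 16)
    heads = [1, 1, 1, 2, 2]
    logits = BiDepAttention(16, num_labels=4)(x, heads)

Because both directions are plain masked attention, all tokens are processed in parallel; this is the property the abstract highlights over recursive tree encoders, which must visit nodes sequentially.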

Cite this Paper


BibTeX
@InProceedings{pmlr-v129-deng20a,
  title     = {Bidirectional Dependency-Guided Attention for Relation Extraction},
  author    = {Deng, Xingchen and Zhang, Lei and Fan, Yixing and Bai, Long and Guo, Jiafeng and Wang, Pengfei},
  booktitle = {Proceedings of The 12th Asian Conference on Machine Learning},
  pages     = {129--144},
  year      = {2020},
  editor    = {Pan, Sinno Jialin and Sugiyama, Masashi},
  volume    = {129},
  series    = {Proceedings of Machine Learning Research},
  month     = {18--20 Nov},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v129/deng20a/deng20a.pdf},
  url       = {https://proceedings.mlr.press/v129/deng20a.html},
  abstract  = {The dependency relations between words in a sentence are critical for relation extraction. Existing methods often utilize dependencies together with various pruning strategies and thus suffer from the loss of detailed semantic information. In order to exploit the dependency structure more effectively, we propose a novel bidirectional dependency-guided attention model. The main idea is to use a top-down attention as well as a bottom-up attention to fully capture the dependencies at different granularities. Specifically, the bottom-up attention models the local semantics from the subtree of each node, while the top-down attention models the global semantics from the ancestor nodes. Moreover, we employ a label embedding component to attend over the contextual features extracted by the dependency-guided attention. Overall, the proposed model is fully attention-based, which makes it easy to parallelize. Experimental results on the TACRED and SemEval 2010 Task 8 datasets show that our model outperforms existing dependency-based models as well as powerful pretrained models. Moreover, the proposed model achieves state-of-the-art performance on the TACRED dataset.}
}
Endnote
%0 Conference Paper
%T Bidirectional Dependency-Guided Attention for Relation Extraction
%A Xingchen Deng
%A Lei Zhang
%A Yixing Fan
%A Long Bai
%A Jiafeng Guo
%A Pengfei Wang
%B Proceedings of The 12th Asian Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Sinno Jialin Pan
%E Masashi Sugiyama
%F pmlr-v129-deng20a
%I PMLR
%P 129--144
%U https://proceedings.mlr.press/v129/deng20a.html
%V 129
%X The dependency relations between words in a sentence are critical for relation extraction. Existing methods often utilize dependencies together with various pruning strategies and thus suffer from the loss of detailed semantic information. In order to exploit the dependency structure more effectively, we propose a novel bidirectional dependency-guided attention model. The main idea is to use a top-down attention as well as a bottom-up attention to fully capture the dependencies at different granularities. Specifically, the bottom-up attention models the local semantics from the subtree of each node, while the top-down attention models the global semantics from the ancestor nodes. Moreover, we employ a label embedding component to attend over the contextual features extracted by the dependency-guided attention. Overall, the proposed model is fully attention-based, which makes it easy to parallelize. Experimental results on the TACRED and SemEval 2010 Task 8 datasets show that our model outperforms existing dependency-based models as well as powerful pretrained models. Moreover, the proposed model achieves state-of-the-art performance on the TACRED dataset.
APA
Deng, X., Zhang, L., Fan, Y., Bai, L., Guo, J. & Wang, P. (2020). Bidirectional Dependency-Guided Attention for Relation Extraction. Proceedings of The 12th Asian Conference on Machine Learning, in Proceedings of Machine Learning Research 129:129-144. Available from https://proceedings.mlr.press/v129/deng20a.html.