Hybrid Neural Network Model for Extracting Character Relationships that Integrates Multi-level Information

Zhu Xichao, Chen Zhanbin
Proceedings of 2024 International Conference on Machine Learning and Intelligent Computing, PMLR 245:120-128, 2024.

Abstract

In the realm of natural language processing, extracting entity relationships in Chinese holds paramount importance, serving as the bedrock for various downstream tasks. However, the complexity entrenched within Chinese linguistic structures and semantic nuances presents formidable challenges. To surmount these obstacles, our study proposes a neural network model designed to integrate multi-level information for relationship extraction. The model segments each sentence into two distinct components, the entity-adjacent words and the sentence itself, and capitalizes on the strengths of BERT pretrained models for dynamic word embedding. Furthermore, leveraging Bidirectional Gated Recurrent Units (BiGRU) and Convolutional Neural Networks (CNN) at the sentence level enables the capture of both sequential and structural features. At the entity-adjacent word level, a fusion of two fully connected neural networks extracts intricate associations between entities and neighboring words. Rigorous ablation and comparative experiments conducted on a comprehensive corpus underscore the efficacy of the proposed approach. Compared to benchmark methods, our model achieves a substantial 6.61% increase in recall and a noteworthy 5.31% improvement in F1 score.
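The two-branch architecture the abstract describes, a sentence-level branch (BiGRU + CNN over BERT embeddings) and an entity-adjacent branch (two fully connected layers), fused for relation classification, can be sketched roughly as follows. This is a minimal NumPy illustration of the fusion idea only: the dimensions, the random projections standing in for BERT, BiGRU, and CNN, and the mean-pooling step are all assumptions for illustration, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Assumed dimensions (not taken from the paper).
D_BERT = 768   # BERT token embedding size
D_SENT = 128   # sentence-branch feature size (BiGRU+CNN stand-in)
D_ADJ = 64     # entity-adjacent branch feature size
N_REL = 5      # number of relationship classes (hypothetical)

# Random projections stand in for the trained sentence-level branch
# (BiGRU + CNN) and the two fully connected entity-adjacent layers.
W_sent = rng.normal(scale=0.02, size=(D_BERT, D_SENT))
W_adj1 = rng.normal(scale=0.02, size=(2 * D_BERT, D_ADJ))
W_adj2 = rng.normal(scale=0.02, size=(D_ADJ, D_ADJ))
W_cls = rng.normal(scale=0.02, size=(D_SENT + D_ADJ, N_REL))

def extract_relation(sent_emb, adj_emb):
    """sent_emb: (T, D_BERT) BERT token embeddings of the sentence;
    adj_emb: (2*D_BERT,) concatenated embeddings of the words
    adjacent to the two entities."""
    # Sentence branch: mean pooling stands in for BiGRU+CNN features.
    h_sent = relu(sent_emb.mean(axis=0) @ W_sent)
    # Entity-adjacent branch: two fully connected layers.
    h_adj = relu(relu(adj_emb @ W_adj1) @ W_adj2)
    # Fuse both information levels and classify the relationship.
    return softmax(np.concatenate([h_sent, h_adj]) @ W_cls)

# Dummy inputs in place of real BERT embeddings.
probs = extract_relation(rng.normal(size=(20, D_BERT)),
                         rng.normal(size=(2 * D_BERT,)))
```

The key design point the abstract emphasizes is the fusion step: sentence-level and entity-adjacent features are computed separately and concatenated before the final classifier, so each level contributes its own view of the relationship.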

Cite this Paper


BibTeX
@InProceedings{pmlr-v245-xichao24a,
  title     = {Hybrid Neural Network Model for Extracting Character Relationships that Integrates Multi-level Information},
  author    = {Xichao, Zhu and Zhanbin, Chen},
  booktitle = {Proceedings of 2024 International Conference on Machine Learning and Intelligent Computing},
  pages     = {120--128},
  year      = {2024},
  editor    = {Nianyin, Zeng and Pachori, Ram Bilas},
  volume    = {245},
  series    = {Proceedings of Machine Learning Research},
  month     = {26--28 Apr},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v245/main/assets/xichao24a/xichao24a.pdf},
  url       = {https://proceedings.mlr.press/v245/xichao24a.html},
  abstract  = {In the realm of natural language processing, extracting entity relationships in Chinese holds paramount importance, serving as the bedrock for various downstream tasks. However, the complexity entrenched within Chinese linguistic structures and semantic nuances presents formidable challenges. To surmount these obstacles, our study proposes a neural network model designed to integrate multi-level information for relationship extraction. The model segments each sentence into two distinct components, the entity-adjacent words and the sentence itself, and capitalizes on the strengths of BERT pretrained models for dynamic word embedding. Furthermore, leveraging Bidirectional Gated Recurrent Units (BiGRU) and Convolutional Neural Networks (CNN) at the sentence level enables the capture of both sequential and structural features. At the entity-adjacent word level, a fusion of two fully connected neural networks extracts intricate associations between entities and neighboring words. Rigorous ablation and comparative experiments conducted on a comprehensive corpus underscore the efficacy of the proposed approach. Compared to benchmark methods, our model achieves a substantial 6.61% increase in recall and a noteworthy 5.31% improvement in F1 score.}
}
Endnote
%0 Conference Paper
%T Hybrid Neural Network Model for Extracting Character Relationships that Integrates Multi-level Information
%A Zhu Xichao
%A Chen Zhanbin
%B Proceedings of 2024 International Conference on Machine Learning and Intelligent Computing
%C Proceedings of Machine Learning Research
%D 2024
%E Zeng Nianyin
%E Ram Bilas Pachori
%F pmlr-v245-xichao24a
%I PMLR
%P 120--128
%U https://proceedings.mlr.press/v245/xichao24a.html
%V 245
%X In the realm of natural language processing, extracting entity relationships in Chinese holds paramount importance, serving as the bedrock for various downstream tasks. However, the complexity entrenched within Chinese linguistic structures and semantic nuances presents formidable challenges. To surmount these obstacles, our study proposes a neural network model designed to integrate multi-level information for relationship extraction. The model segments each sentence into two distinct components, the entity-adjacent words and the sentence itself, and capitalizes on the strengths of BERT pretrained models for dynamic word embedding. Furthermore, leveraging Bidirectional Gated Recurrent Units (BiGRU) and Convolutional Neural Networks (CNN) at the sentence level enables the capture of both sequential and structural features. At the entity-adjacent word level, a fusion of two fully connected neural networks extracts intricate associations between entities and neighboring words. Rigorous ablation and comparative experiments conducted on a comprehensive corpus underscore the efficacy of the proposed approach. Compared to benchmark methods, our model achieves a substantial 6.61% increase in recall and a noteworthy 5.31% improvement in F1 score.
APA
Xichao, Z. & Zhanbin, C. (2024). Hybrid Neural Network Model for Extracting Character Relationships that Integrates Multi-level Information. Proceedings of 2024 International Conference on Machine Learning and Intelligent Computing, in Proceedings of Machine Learning Research 245:120-128. Available from https://proceedings.mlr.press/v245/xichao24a.html.