Hybrid Neural Network Model for Extracting Character Relationships that Integrates Multi-level Information
Proceedings of 2024 International Conference on Machine Learning and Intelligent Computing, PMLR 245:120-128, 2024.
Abstract
In natural language processing, Chinese entity relationship extraction is of central importance, serving as the foundation for many downstream tasks. However, the complexity of Chinese linguistic structures and semantic nuances poses formidable challenges. To address these obstacles, this study proposes a neural network model designed to integrate multi-level information for relationship extraction. The model segments each sentence into two distinct components, the entity-adjacent words and the sentence itself, and draws on a BERT pretrained model for dynamic word embedding. At the sentence level, Bidirectional Gated Recurrent Units (BiGRU) and Convolutional Neural Networks (CNN) capture sequential and structural features, respectively. At the entity-adjacent word level, two fully connected neural networks jointly extract the associations between entities and their neighboring words. Ablation and comparative experiments conducted on a comprehensive corpus demonstrate the efficacy of the proposed approach: compared with benchmark methods, the model achieves a 6.61% increase in recall and a 5.31% improvement in F1 score.
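The multi-level fusion idea described above can be illustrated with a schematic forward pass. The following is a minimal NumPy sketch, not the authors' implementation: random vectors stand in for BERT embeddings, a simple 1-D convolution with max-pooling stands in for the CNN branch, mean pooling is a crude placeholder for the BiGRU sequential summary, and the context window positions and the number of relation classes are hypothetical. It shows only how sentence-level and entity-adjacent features can be computed separately and then fused for classification.

```python
import numpy as np

rng = np.random.default_rng(0)

def dense(x, w, b):
    # one fully connected layer with ReLU activation
    return np.maximum(x @ w + b, 0.0)

def conv1d_maxpool(seq, kernel):
    # valid 1-D convolution over the token axis, followed by max-pooling
    k = kernel.shape[0]
    scores = [np.sum(seq[i:i + k] * kernel) for i in range(len(seq) - k + 1)]
    return max(scores)

T, d = 10, 16                       # tokens and embedding size (stand-in for BERT output)
sent = rng.normal(size=(T, d))      # dynamic word embeddings for one sentence

# Sentence level: CNN branch (three kernels) for structural features,
# plus mean pooling as a placeholder for the BiGRU sequential summary.
kernels = [rng.normal(size=(3, d)) for _ in range(3)]
cnn_feats = np.array([conv1d_maxpool(sent, k) for k in kernels])
seq_feat = sent.mean(axis=0)

# Entity-adjacent level: two fully connected networks over the words
# neighboring each entity (window positions here are illustrative).
left_ctx = sent[1:3].ravel()        # words adjacent to entity 1
right_ctx = sent[6:8].ravel()       # words adjacent to entity 2
w1, b1 = rng.normal(size=(left_ctx.size, 8)), np.zeros(8)
w2, b2 = rng.normal(size=(right_ctx.size, 8)), np.zeros(8)
ent_feat = np.concatenate([dense(left_ctx, w1, b1), dense(right_ctx, w2, b2)])

# Fuse both levels and classify into relation types with a softmax
# (4 relation classes is a hypothetical choice for this sketch).
fused = np.concatenate([cnn_feats, seq_feat, ent_feat])
wc = rng.normal(size=(fused.size, 4))
logits = fused @ wc
probs = np.exp(logits - logits.max())
probs /= probs.sum()
print(probs.shape)
```

In the full model, the BiGRU and CNN outputs at the sentence level and the dense outputs at the entity-adjacent level would all be trained jointly; the sketch only demonstrates the concatenation-based fusion of the two information levels before the final classifier.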