Association Journal of CSIAM
Supervised by Ministry of Education of PRC
Sponsored by Xi'an Jiaotong University
ISSN 1005-3085  CN 61-1269/O1

Chinese Journal of Engineering Mathematics


GR-BERT Model Based on Global Semantic Information

WANG Yuhua1,   HU Junying1,  SUN Kai2,  CHANG Peiju3,  FEI Rongrong4   

  1. School of Mathematics, Northwest University, Xi'an 710127

    2. School of Mathematics and Statistics, Xi'an Jiaotong University, Xi'an 710049

    3. School of Mathematics and Information Science, North Minzu University, Yinchuan 750021

    4. School of Electronic Information and Artificial Intelligence, Shaanxi University of Science and Technology, Xi'an 710016
  • Received: 2022-11-29  Accepted: 2024-09-24  Online: 2025-10-15  Published: 2025-10-15
  • Contact: J. Hu. E-mail address: hujunying@nwu.edu.cn
  • Supported by:
    The National Natural Science Foundation of China (12001428); the General Project of the Natural Science Basic Research Plan in Shaanxi Province (2024JC-YBQN-0037); the Special Research Program of the Shaanxi Provincial Department of Education (23JK0347).

Abstract:

Relation extraction is an important natural language processing task that involves extracting the relationships between entities. Recent research has shown that the pre-trained BERT model achieves very good results on natural language processing tasks. Since then, many methods that use the pre-trained BERT model for relation extraction have been developed, with R-BERT being a representative example. However, R-BERT considers neither the semantic differences between the subject and object entities nor the impact of global semantic information on the accuracy of relation extraction. This paper addresses both issues. First, two separate fully connected layers are set up to extract information for the subject and object entities, so that the semantic difference between them is incorporated into the model's learning process. Second, a new fully connected layer with an activation function is added after the existing information-fusion module to fully integrate high-dimensional global semantic information with the entity pair. The resulting model, which integrates both semantic differences and global semantic information, is called GR-BERT. Experiments on a Chinese entity relation extraction dataset show that GR-BERT significantly improves on the original R-BERT, validating its effectiveness.
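The two architectural changes described above can be sketched numerically. The following is a minimal, hedged illustration (not the authors' implementation): toy dimensions replace BERT's actual hidden size, random matrices stand in for learned weights, the tanh activations and mean-pooled entity vectors are assumptions in the spirit of R-BERT, and the real model would obtain `h_cls`, `h_subj`, and `h_obj` from a pre-trained BERT encoder.

```python
import numpy as np

rng = np.random.default_rng(0)
H = 8   # toy hidden size (BERT-base uses 768)
R = 4   # toy number of relation classes

def linear(x, W, b):
    """A plain fully connected layer: x @ W + b."""
    return x @ W + b

# Stand-ins for BERT outputs: the [CLS] vector (global semantics)
# and pooled token vectors for the subject and object entity spans.
h_cls  = rng.standard_normal(H)
h_subj = rng.standard_normal(H)
h_obj  = rng.standard_normal(H)

# Change 1 (per the abstract): SEPARATE fully connected layers for the
# subject and object entities, so their semantic difference is learnable
# (R-BERT shares one layer across both entities).
W_s, b_s = rng.standard_normal((H, H)), np.zeros(H)
W_o, b_o = rng.standard_normal((H, H)), np.zeros(H)
e_s = np.tanh(linear(h_subj, W_s, b_s))
e_o = np.tanh(linear(h_obj,  W_o, b_o))

# Information fusion: concatenate the global [CLS] representation
# with the two entity representations.
fused = np.concatenate([np.tanh(h_cls), e_s, e_o])   # shape (3H,)

# Change 2: an EXTRA fully connected layer with an activation after the
# fusion module, mixing global semantics with the entity pair before
# the final classifier.
W_g, b_g = rng.standard_normal((3 * H, 3 * H)), np.zeros(3 * H)
g = np.tanh(linear(fused, W_g, b_g))

# Softmax classifier over relation labels.
W_c, b_c = rng.standard_normal((3 * H, R)), np.zeros(R)
logits = linear(g, W_c, b_c)
probs = np.exp(logits - logits.max())
probs /= probs.sum()
print(probs.shape)  # (4,)
```

In training, the two entity layers (`W_s`, `W_o`), the post-fusion layer (`W_g`), and the classifier (`W_c`) would all be learned jointly with the BERT encoder under a cross-entropy loss; only the separation of `W_s` from `W_o` and the presence of `W_g` distinguish this sketch from the R-BERT pipeline.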

Key words: BERT model, natural language processing, relation extraction, neural network
