Official journal of the China Society for Industrial and Applied Mathematics
Supervised by: Ministry of Education of the People's Republic of China
Sponsored by: Xi'an Jiaotong University
ISSN 1005-3085  CN 61-1269/O1

Chinese Journal of Engineering Mathematics



GR-BERT Model Based on Global Semantic Information

WANG Yuhua1, HU Junying1, SUN Kai2, CHANG Peiju3, FEI Rongrong4

1. School of Mathematics, Northwest University, Xi'an 710127

    2. School of Mathematics and Statistics, Xi'an Jiaotong University, Xi'an 710049

    3. School of Mathematics and Information Science, North Minzu University, Yinchuan 750021

    4. School of Electronic Information and Artificial Intelligence, Shaanxi University of Science and Technology, Xi'an 710016
  • Received: 2022-11-29  Accepted: 2024-09-24  Online: 2025-10-15  Published: 2025-10-15
  • Corresponding author: HU Junying. E-mail: hujunying@nwu.edu.cn
  • Supported by:
    The National Natural Science Foundation of China (12001428); the General Project of the Natural Science Basic Research Plan in Shaanxi Province (2024JC-YBQN-0037); the Special Research Program of the Shaanxi Provincial Department of Education (23JK0347).


Abstract:

Relation extraction is an important natural language processing task that extracts the relations between entities. Recent research has shown that pre-trained BERT models achieve very good results on natural language processing tasks, and many methods that apply pre-trained BERT to relation extraction have since been developed, with R-BERT being a representative example. However, R-BERT considers neither the semantic difference between the subject entity and the object entity nor the influence of global semantic information on the accuracy of relation extraction. This paper addresses both issues: two separate fully connected layers extract the representations of the subject and object entities, so that the semantic difference between them enters the model's learning process, and a new fully connected layer with an activation function is added after the existing information fusion module, so that high-dimensional global semantic information is fully fused with the entity pair. The resulting model, which integrates both the semantic difference and the global semantic information, is called GR-BERT. Experiments on a Chinese person-relation extraction dataset show that GR-BERT significantly outperforms the original R-BERT, validating the effectiveness of the proposed method.
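To make the architectural change concrete, here is a minimal PyTorch sketch of such a classification head. The layer names (subj_fc, obj_fc, fusion_fc), the dimensions, and the tanh activation are illustrative assumptions rather than details taken from the paper; following R-BERT, the entity vectors are assumed to be the averaged hidden states over each entity span, and the global vector the hidden state of the [CLS] token.

```python
import torch
import torch.nn as nn

class GRBertHead(nn.Module):
    """Sketch of a GR-BERT-style classification head (hypothetical names/sizes).

    Assumed differences from R-BERT, per the abstract:
      * separate fully connected layers for the subject and the object entity
        (rather than one shared layer), capturing their semantic difference;
      * an extra fully connected layer with an activation after the fusion
        (concatenation) step, to mix global semantics with the entity pair.
    """

    def __init__(self, hidden: int = 768, n_relations: int = 12, p_drop: float = 0.1):
        super().__init__()
        self.cls_fc = nn.Linear(hidden, hidden)    # global [CLS] representation
        self.subj_fc = nn.Linear(hidden, hidden)   # subject entity
        self.obj_fc = nn.Linear(hidden, hidden)    # object entity, separate weights
        self.fusion_fc = nn.Linear(3 * hidden, 3 * hidden)  # added fusion layer
        self.act = nn.Tanh()
        self.drop = nn.Dropout(p_drop)
        self.classifier = nn.Linear(3 * hidden, n_relations)

    def forward(self, cls_vec, subj_vec, obj_vec):
        # cls_vec: hidden state of [CLS]; subj_vec / obj_vec: averaged hidden
        # states over the subject / object entity tokens, as in R-BERT.
        h_cls = self.act(self.cls_fc(self.drop(cls_vec)))
        h_subj = self.act(self.subj_fc(self.drop(subj_vec)))
        h_obj = self.act(self.obj_fc(self.drop(obj_vec)))
        fused = torch.cat([h_cls, h_subj, h_obj], dim=-1)
        # the new activated fully connected layer after information fusion
        fused = self.act(self.fusion_fc(self.drop(fused)))
        return self.classifier(fused)
```

With hidden=768 and inputs of shape (batch, 768), the head returns relation logits of shape (batch, n_relations); the BERT encoder that produces cls_vec, subj_vec, and obj_vec is omitted from this sketch.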

Key words: BERT model, natural language processing, relation extraction, neural network

CLC number: