
Ancient Chinese Named Entity Recognition Based on SikuBERT Model and MHA

陈雪松 詹子依 王浩畅

吉林大学学报(信息科学版) 2023, Vol.41, Issue(5): 866-875,10


陈雪松 1, 詹子依 1, 王浩畅 2

Author Information

  • 1. School of Electrical and Information Engineering, Northeast Petroleum University, Daqing 163318, Heilongjiang, China
  • 2. School of Computer and Information Technology, Northeast Petroleum University, Daqing 163318, Heilongjiang, China

Abstract

Aiming at the problem that traditional named entity recognition methods cannot fully learn the complex sentence-structure information of ancient Chinese and are prone to information loss when extracting features from long sequences, an ancient Chinese named entity recognition method fusing the SikuBERT (Siku Bidirectional Encoder Representation from Transformers) model and MHA (Multi-Head Attention) is proposed. First, the SikuBERT model is used to pre-train on an ancient Chinese corpus; the information vectors obtained from training are then input into a BiLSTM (Bidirectional Long Short-Term Memory) network to extract features, and the output features of the BiLSTM layer are assigned different weights through MHA to reduce the information loss of long sequences. Finally, the predicted sequence labels are obtained through CRF (Conditional Random Field) decoding. Experiments show that, compared with commonly used models such as BiLSTM-CRF and BERT-BiLSTM-CRF, the F1 value of this method is significantly improved, verifying that it can effectively improve ancient Chinese named entity recognition.
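The pipeline the abstract describes (pre-trained encoder → BiLSTM → multi-head attention → label scores → CRF decoding) can be sketched in PyTorch as below. This is a minimal illustration, not the authors' implementation: the SikuBERT encoder is stood in for by a plain embedding layer, the CRF is reduced to a linear emission layer whose scores a CRF would decode, and all dimensions (vocabulary size, hidden size, number of heads, label count) are illustrative assumptions rather than values from the paper.

```python
import torch
import torch.nn as nn

class SikuBertBiLstmMhaSketch(nn.Module):
    """Sketch of the SikuBERT-BiLSTM-MHA-CRF pipeline from the abstract.

    The embedding layer is a placeholder for the SikuBERT encoder, and
    the final linear layer produces per-token emission scores that a CRF
    layer would decode into label sequences.
    """

    def __init__(self, vocab_size=100, emb_dim=64, hidden=64,
                 heads=4, num_labels=7):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)   # stand-in for SikuBERT
        self.bilstm = nn.LSTM(emb_dim, hidden,
                              batch_first=True, bidirectional=True)
        # MHA reweights the BiLSTM features to ease long-sequence loss
        self.mha = nn.MultiheadAttention(2 * hidden, heads, batch_first=True)
        self.emissions = nn.Linear(2 * hidden, num_labels)

    def forward(self, token_ids):
        x = self.embed(token_ids)        # (batch, seq_len, emb_dim)
        h, _ = self.bilstm(x)            # (batch, seq_len, 2 * hidden)
        a, _ = self.mha(h, h, h)         # self-attention over BiLSTM outputs
        return self.emissions(a)         # (batch, seq_len, num_labels)

model = SikuBertBiLstmMhaSketch()
scores = model(torch.randint(0, 100, (2, 10)))  # 2 sentences of length 10
print(scores.shape)  # torch.Size([2, 10, 7])
```

In a full implementation, the emission scores would be passed to a CRF layer (e.g. Viterbi decoding at inference time) to enforce valid label transitions, and the embedding layer would be replaced by the pre-trained SikuBERT encoder.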


Keywords

ancient Chinese / named entity recognition / SikuBERT (Siku Bidirectional Encoder Representation from Transformers) model / multi-head attention mechanism

Classification

Information Technology and Security Science

Citation

陈雪松, 詹子依, 王浩畅. 融合SikuBERT模型与MHA的古汉语命名实体识别[J]. 吉林大学学报(信息科学版), 2023, 41(5): 866-875,10.

Funding

Supported by the National Natural Science Foundation of China (61402099, 61702093)

OACSTPCD
ISSN 1671-5896
