
BERT Text Sentiment Analysis Model Based on Self-attention Mechanism (基于自注意力机制的BERT文本情感分析模型)

朱珍元, 苏喻

Journal of Hainan Normal University (Natural Science), 2025, Vol. 38, Issue 3: 281-288. DOI: 10.12051/j.issn.1674-4942.2025.03.004


朱珍元 1, 苏喻 2

Author Information

  • 1. Department of Information Management, Anhui Vocational College of Police Officers, Hefei 230031, Anhui, China
  • 2. School of Computer Science, Hefei Normal University, Hefei 230601, Anhui, China || Institute of Artificial Intelligence, Hefei Comprehensive National Science Center, Hefei 230088, Anhui, China

Abstract

In the field of text sentiment analysis, the BERT model is widely used because of its powerful feature-extraction ability. However, empirical research shows that without fine-tuning, BERT's accuracy may suffer a significant loss, so the model's actual performance fails to meet expectations. To solve this problem, a BERT text sentiment analysis model combining self-attention is proposed: BERT-BLSTM-Attention. The model comprehensively uses BERT's pre-training ability, a BLSTM, and the self-attention mechanism to enhance the understanding and analysis of text sentiment. First, the BERT model represents the input text as high-dimensional feature vectors. As a powerful pre-trained model, BERT captures rich semantic information and contextual features, providing the basic input for the subsequent layers; at this stage, BERT's bidirectional encoding allows the model to extract finer-grained semantic information from the context, which is crucial for sentiment analysis. Then, after the BLSTM layer, a multi-head self-attention mechanism is introduced. With self-attention, the model can pay more attention to the important parts of the text when processing the input sequence, strengthening these key features by dynamically assigning weights. Finally, the model uses a softmax function in the output layer for sentiment classification: based on the collected features, it generates a probability distribution over the emotion categories. Beyond effective classification, the model also shows excellent generalization ability. Experimental results show that the accuracy of the model with the self-attention mechanism is 1.8% higher than the variant without BLSTM and 0.9% higher than the variant without the BERT model, which demonstrates the effectiveness of the proposed model in language feature extraction.
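The core of the pipeline described above — BLSTM hidden states re-weighted by self-attention, followed by a softmax classification head — can be sketched as follows. This is a minimal single-head NumPy illustration, not the authors' implementation: the learned query/key/value projections and the multiple heads of the actual model are omitted, and all function names and shapes here are illustrative.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: subtract the max before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(H):
    """Scaled dot-product self-attention over a sequence of hidden states.

    H: (seq_len, d) array, e.g. BLSTM outputs for one sentence.
    Returns the attention-weighted states and the (seq_len, seq_len)
    weight matrix, whose rows are probability distributions.
    """
    d = H.shape[-1]
    scores = H @ H.T / np.sqrt(d)        # pairwise token relevance
    weights = softmax(scores, axis=-1)   # dynamically assigned weights
    return weights @ H, weights

def classify(features, W, b):
    # Output layer: softmax over class logits gives a probability
    # distribution over the emotion categories.
    return softmax(features @ W + b)

# Toy run: 5 tokens with 8-dimensional states, 3 sentiment classes.
rng = np.random.default_rng(0)
H = rng.normal(size=(5, 8))
attended, weights = self_attention(H)
probs = classify(attended.mean(axis=0), rng.normal(size=(8, 3)), np.zeros(3))
```

In the full model, each attention head would apply its own learned projections to produce queries, keys, and values before the dot product; the sketch above keeps only the weighting-and-pooling behaviour that lets important tokens dominate the pooled representation.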

Key words

BERT model / text sentiment analysis / self-attention mechanism

Classification

General Natural Sciences

Cite this article

朱珍元, 苏喻. BERT Text Sentiment Analysis Model Based on Self-attention Mechanism [J]. Journal of Hainan Normal University (Natural Science), 2025, 38(3): 281-288.

Funding

Key Project of Natural Science Research in Anhui Higher Education Institutions (2022AH052939)

Key Teaching Research Project of Anhui Vocational College of Police Officers (2022yjjyxm11)

Journal of Hainan Normal University (Natural Science)

ISSN 1674-4942
