

BERT Text Sentiment Analysis Model Based on Self-attention Mechanism

Abstract

In the field of text sentiment analysis, the BERT model is widely used because of its powerful feature extraction ability. However, empirical studies show that without fine-tuning, BERT's accuracy can drop significantly, so the model's actual performance fails to meet expectations. To address this problem, a BERT text sentiment analysis model combining self-attention, BERT-BLSTM-Attention, is proposed. The model makes comprehensive use of BERT's pre-training ability, a bidirectional LSTM (BLSTM), and the self-attention mechanism to enhance the understanding and analysis of text sentiment. First, the BERT model represents the input text as high-dimensional feature vectors. As a powerful pre-trained model, BERT captures rich semantic information and contextual features, providing the basic input for the subsequent layers; its bidirectional encoding allows the model to extract fine-grained semantic information from context, which is crucial for sentiment analysis. Then, a multi-head self-attention mechanism is introduced after the BLSTM layer, allowing the model to focus on the important parts of the text while processing the input sequence and to strengthen these key features by dynamically assigning weights. Finally, the output layer uses the SoftMax function for text sentiment classification: based on the collected features, the model generates a probability distribution over the sentiment categories. Besides classifying effectively, the model also shows excellent generalization ability. Experiments show that the accuracy of the BLSTM model with the self-attention mechanism is 1.8% higher than that of the BLSTM model without self-attention, and 0.9% higher than that of the model without BERT, which fully demonstrates the effectiveness of the proposed model in language feature extraction.
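The abstract only outlines the BERT-BLSTM-Attention pipeline, so a minimal sketch of such an architecture is given below for illustration. It is written in PyTorch with the Hugging Face transformers library; the bert-base-chinese checkpoint, the layer sizes, the head count, and the number of sentiment classes are all assumptions made for the sketch, not the paper's reported configuration.

```python
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer


class BertBLSTMAttention(nn.Module):
    """Sketch of a BERT -> BLSTM -> multi-head self-attention -> SoftMax classifier."""

    def __init__(self, num_classes=2, lstm_hidden=128, num_heads=4,
                 bert_name="bert-base-chinese"):
        super().__init__()
        # Pre-trained BERT encoder: maps tokens to contextual feature vectors.
        self.bert = BertModel.from_pretrained(bert_name)
        # Bidirectional LSTM over the BERT token representations.
        self.blstm = nn.LSTM(self.bert.config.hidden_size, lstm_hidden,
                             batch_first=True, bidirectional=True)
        # Multi-head self-attention applied after the BLSTM layer.
        self.attn = nn.MultiheadAttention(embed_dim=2 * lstm_hidden,
                                          num_heads=num_heads, batch_first=True)
        # Linear layer producing per-class scores.
        self.classifier = nn.Linear(2 * lstm_hidden, num_classes)

    def forward(self, input_ids, attention_mask):
        # 1) BERT represents the input text as high-dimensional vectors.
        hidden = self.bert(input_ids=input_ids,
                           attention_mask=attention_mask).last_hidden_state
        # 2) The BLSTM captures sequential dependencies in both directions.
        seq, _ = self.blstm(hidden)
        # 3) Self-attention re-weights the sequence so that important tokens
        #    contribute more; padding positions are masked out.
        pad_mask = attention_mask == 0  # True where the token is padding
        attended, _ = self.attn(seq, seq, seq, key_padding_mask=pad_mask)
        # 4) Mean-pool over the sequence and classify with SoftMax.
        pooled = attended.mean(dim=1)
        return torch.softmax(self.classifier(pooled), dim=-1)


# Quick smoke test with the matching tokenizer.
tokenizer = BertTokenizer.from_pretrained("bert-base-chinese")
batch = tokenizer(["这部电影真好看"], return_tensors="pt", padding=True)
model = BertBLSTMAttention()
probs = model(batch["input_ids"], batch["attention_mask"])
print(probs)  # one probability per sentiment class
```

The final torch.softmax mirrors the SoftMax output layer described in the abstract; in a real training loop one would usually return the raw logits and let nn.CrossEntropyLoss apply the normalization instead.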

朱珍元;苏喻

Department of Information Management, Anhui Vocational College of Police Officers, Hefei 230031, Anhui, China || School of Computer Science, Hefei Normal University, Hefei 230601, Anhui, China || Institute of Artificial Intelligence, Hefei Comprehensive National Science Center, Hefei 230088, Anhui, China

Natural Sciences (Comprehensive)

BERT model; text sentiment analysis; self-attention mechanism

《海南师范大学学报(自然科学版)》 (Journal of Hainan Normal University (Natural Science)), 2025(3)

281-288 (8 pages)

Supported by the Key Project of Natural Science Research in Anhui Universities (2022AH052939) and the Key Teaching Research Project of Anhui Vocational College of Police Officers (2022yjjyxm11)

10.12051/j.issn.1674-4942.2025.03.004
