
Text Classification Model Based on Transformer Multi-Head Attention Mechanism

林云

现代信息科技, 2025, Vol. 9, Issue (6): 100-104,5. DOI: 10.19850/j.cnki.2096-4706.2025.06.019


林云1

Author Information

  • 1. Xiamen University Tan Kah Kee College, Zhangzhou 363123, Fujian, China

Abstract

In the information age, automatic text classification is important for improving information retrieval efficiency. However, traditional classification models such as Naive Bayes, Support Vector Machines (SVM), and Logistic Regression fall short in handling long text and deep semantic understanding. Therefore, this study presents a text classification model based on the Transformer. By utilizing the Multi-Head Attention mechanism, the model effectively extracts key information and semantic relationships from text, thus boosting classification performance. Experiments demonstrate that the model consistently outperforms traditional CNN and LSTM baselines across various datasets, reaching a highest accuracy of 94.62%, which verifies its effectiveness in text classification tasks.
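The paper's implementation details are not reproduced on this page. As a rough illustration of the Multi-Head Attention mechanism the abstract refers to, the sketch below computes scaled dot-product attention split across several heads in plain NumPy; all dimensions, weight shapes, and the mean-pooling step are illustrative assumptions, not the authors' actual configuration:

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax over the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(X, Wq, Wk, Wv, Wo, num_heads):
    """Self-attention over X (seq_len, d_model), split across num_heads heads."""
    seq_len, d_model = X.shape
    d_head = d_model // num_heads
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    # (seq_len, d_model) -> (num_heads, seq_len, d_head)
    split = lambda M: M.reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    Qh, Kh, Vh = split(Q), split(K), split(V)
    scores = Qh @ Kh.transpose(0, 2, 1) / np.sqrt(d_head)  # (heads, seq, seq)
    weights = softmax(scores, axis=-1)                     # attention per head
    heads = weights @ Vh                                   # (heads, seq, d_head)
    # concatenate heads back to (seq_len, d_model), then project
    concat = heads.transpose(1, 0, 2).reshape(seq_len, d_model)
    return concat @ Wo

# Illustrative sizes only; real models use far larger dimensions.
rng = np.random.default_rng(0)
d_model, num_heads, seq_len = 8, 2, 5
X = rng.normal(size=(seq_len, d_model))                    # token embeddings
W = [rng.normal(size=(d_model, d_model)) * 0.1 for _ in range(4)]
out = multi_head_attention(X, *W, num_heads)
# For classification, one common (assumed) choice is to mean-pool the
# token representations and feed them to a linear classifier head.
pooled = out.mean(axis=0)
print(out.shape, pooled.shape)
```

In a full classifier, this block would be stacked with residual connections, layer normalization, and feed-forward layers, with a softmax output layer over the document classes.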


Key words

Machine Learning / NLP / Multi-Head Attention Mechanism / Text Classification / Transformer

Category

Information Technology and Security Science

Cite This Article

林云. Text Classification Model Based on Transformer Multi-Head Attention Mechanism [J]. 现代信息科技, 2025, 9(6): 100-104,5.

现代信息科技 (Modern Information Technology), ISSN 2096-4706
