Abstract
In the information age, automatic text classification is important for improving the efficiency of information retrieval. However, traditional classification models such as Naive Bayes, Support Vector Machines (SVM), and Logistic Regression are deficient in handling long texts and deep semantic understanding. Therefore, this study presents a Transformer-based text classification model. By utilizing the Multi-Head Attention mechanism, the model effectively extracts key information and semantic relationships from text, thereby boosting classification performance. Experiments demonstrate that the model consistently outperforms traditional CNN and LSTM baselines across various datasets, achieving a highest accuracy of 94.62%, which verifies its effectiveness in text classification tasks.
Keywords
Machine Learning / Natural Language Understanding / Multi-Head Attention Mechanism / Text Classification / Transformer
Category
Information Technology and Security Science
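The Multi-Head Attention mechanism named in the abstract can be sketched as below. This is a minimal illustrative forward pass, not the paper's implementation: the projection matrices (stand-ins for the learned weights W_Q, W_K, W_V, W_O) are random placeholders, and all shapes are hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(x, num_heads, rng):
    """Sketch of multi-head self-attention for one sequence.

    x: (seq_len, d_model) input embeddings.
    Random projections stand in for the learned W_Q, W_K, W_V, W_O.
    """
    seq_len, d_model = x.shape
    assert d_model % num_heads == 0, "d_model must divide evenly across heads"
    d_k = d_model // num_heads
    w_q, w_k, w_v, w_o = (
        rng.standard_normal((d_model, d_model)) / np.sqrt(d_model)
        for _ in range(4)
    )

    q, k, v = x @ w_q, x @ w_k, x @ w_v

    # Split each projection into heads: (num_heads, seq_len, d_k).
    def split(t):
        return t.reshape(seq_len, num_heads, d_k).transpose(1, 0, 2)

    q, k, v = split(q), split(k), split(v)

    # Scaled dot-product attention, computed independently per head.
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_k)   # (heads, seq, seq)
    attn = softmax(scores, axis=-1)                    # rows sum to 1
    heads = attn @ v                                   # (heads, seq, d_k)

    # Concatenate heads and apply the output projection.
    concat = heads.transpose(1, 0, 2).reshape(seq_len, d_model)
    return concat @ w_o, attn

rng = np.random.default_rng(0)
x = rng.standard_normal((5, 8))                        # 5 tokens, d_model = 8
out, attn = multi_head_attention(x, num_heads=2, rng=rng)
```

Each head attends over the full sequence with its own projection subspace, which is what lets the model capture several kinds of token-to-token relationship in parallel; in the classification setting described above, the resulting contextual representations would then feed a pooling layer and a softmax classifier.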