Computer and Digital Engineering, 2025, Vol. 53, Issue 2: 474-479. DOI: 10.3969/j.issn.1672-9722.2025.02.031
Text Classification Model Based on BERT-BiGRU-CNN
Abstract
Convolutional neural networks (CNN) suffer from loss of structural information, and recurrent neural networks (RNN) have difficulty handling long-range dependencies. This paper proposes a BERT-BiGRU-CNN (BeGC) model, which uses the pre-trained BERT model as the text embedding layer to better capture the semantic information of the text. Contextual semantic features are extracted by a bidirectional gated recurrent unit (BiGRU), and the salient information of the text is then obtained through max pooling, thereby addressing the limitations of CNN and RNN in text classification. Finally, comparative experiments are carried out on the public THUCNews dataset. The results show that the model outperforms most classification models, demonstrating its feasibility.
Keywords: text classification / semantic features / BERT / BiGRU / max pooling
Category: Computer and Automation
Citation: SUI Deyi, QI Yunsong. Text Classification Model Based on BERT-BiGRU-CNN [J]. Computer and Digital Engineering, 2025, 53(2): 474-479.
Funding: Supported by the National Natural Science Foundation of China (Grant No. 61471182).
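The pipeline described in the abstract (BERT embedding, BiGRU for contextual features, convolution with max pooling, then a classifier) can be sketched as below. This is a minimal PyTorch sketch, not the authors' implementation: a plain `nn.Embedding` stands in for the pre-trained BERT encoder so the example stays self-contained, and all hyperparameters (hidden sizes, kernel size, class count) are illustrative assumptions.

```python
import torch
import torch.nn as nn


class BeGC(nn.Module):
    """Sketch of the BERT-BiGRU-CNN (BeGC) architecture from the abstract."""

    def __init__(self, vocab_size=1000, embed_dim=128, gru_hidden=64,
                 conv_channels=100, kernel_size=3, num_classes=10):
        super().__init__()
        # Placeholder for the BERT embedding layer (real model would use
        # a pre-trained BERT encoder here).
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Bidirectional GRU extracts contextual semantic features.
        self.bigru = nn.GRU(embed_dim, gru_hidden, batch_first=True,
                            bidirectional=True)
        # Convolution over the BiGRU output sequence.
        self.conv = nn.Conv1d(2 * gru_hidden, conv_channels, kernel_size)
        self.fc = nn.Linear(conv_channels, num_classes)

    def forward(self, token_ids):
        x = self.embed(token_ids)                     # (B, T, E)
        x, _ = self.bigru(x)                          # (B, T, 2H)
        x = torch.relu(self.conv(x.transpose(1, 2)))  # (B, C, T')
        # Global max pooling keeps the most salient feature per channel.
        x = torch.max(x, dim=2).values                # (B, C)
        return self.fc(x)                             # class logits


model = BeGC()
logits = model(torch.randint(0, 1000, (2, 20)))  # batch of 2, length 20
print(logits.shape)  # torch.Size([2, 10])
```

The max pooling over the time dimension is what the abstract refers to as extracting "the important information of the text": each convolutional channel keeps only its strongest activation across the sequence.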