
基于Transformer模型的文本自动摘要生成


计算机与数字工程, 2024, Vol. 52, Issue 2: 482-486, 527. DOI: 10.3969/j.issn.1672-9722.2024.02.034


Automatic Text Summary Generation Based on Transformer Model

刘志敏¹ 张琨¹ 朱浩华¹

Author Information

  • 1. 南京理工大学 (Nanjing University of Science and Technology), Nanjing 210094


Abstract

This paper discusses automatic text summarization, whose task is to generate a concise summary that expresses the main meaning of a text. The traditional Seq2Seq architecture has limited ability to capture and store long-term and global features, so generated summaries often omit important information. This paper therefore proposes a new abstractive summarization model, RC-Transformer-PGN (RCTP), based on the Transformer. The model first augments the Transformer with an additional encoder based on bidirectional GRUs to capture sequential context representations and improve the modeling of local information. Second, it introduces a pointer-generator network and a coverage mechanism to alleviate the problems of out-of-vocabulary words and repeated words. Experimental results on the CNN/Daily Mail dataset show that the proposed model is more effective than the baseline models.
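The two mechanisms named in the abstract can be made concrete. Below is a minimal NumPy sketch (ours, not the paper's implementation) of the pointer-generator mixture distribution and the coverage penalty; the function names and the simplification that all source tokens are in-vocabulary (no extended OOV vocabulary) are our assumptions.

```python
import numpy as np

def final_distribution(p_vocab, attention, src_ids, p_gen):
    """Pointer-generator mixture: p_gen * P_vocab + (1 - p_gen) * copy distribution.

    Simplification (ours): every source token id is assumed in-vocabulary,
    so no extended vocabulary for OOV words is built here.
    """
    final = p_gen * np.asarray(p_vocab, dtype=float)
    for a, idx in zip(attention, src_ids):
        # Route the copy probability mass to the token at source position idx.
        final[idx] += (1.0 - p_gen) * a
    return final

def coverage_loss(attentions):
    """Coverage penalty: sum over steps of min(attention, accumulated coverage)."""
    coverage = np.zeros_like(np.asarray(attentions[0], dtype=float))
    loss = 0.0
    for a in attentions:
        a = np.asarray(a, dtype=float)
        # Penalize attending again to positions that are already covered.
        loss += np.minimum(a, coverage).sum()
        coverage += a
    return loss

# Example: vocabulary of 3 words, source sequence of 2 tokens.
dist = final_distribution(p_vocab=np.array([0.5, 0.3, 0.2]),
                          attention=[0.6, 0.4], src_ids=[2, 0], p_gen=0.5)
# → [0.45, 0.15, 0.40]; it stays a valid distribution whenever
# p_vocab and attention each sum to 1.
```

Repeatedly attending to the same source positions (the cause of repeated words) raises `coverage_loss`, while attention that moves to new positions keeps it at zero, which is how the coverage mechanism discourages repetition.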


Key words

abstractive summarization/Transformer model/pointer generator network/coverage mechanism

Category

Architecture and Water Conservancy

Cite This Article

刘志敏, 张琨, 朱浩华. 基于Transformer模型的文本自动摘要生成[J]. 计算机与数字工程, 2024, 52(2): 482-486, 527.

计算机与数字工程 (Computer & Digital Engineering) · OACSTPCD · ISSN 1672-9722
