
Abstractive Summarization Based on BERT Model

周圆 张琨 陈智源 江浩俊 方自正

Computer and Digital Engineering, 2024, Vol. 52, Issue 10: 3052-3058, 7. DOI: 10.3969/j.issn.1672-9722.2024.10.035


Author information

  • 1. Nanjing University of Science and Technology, Nanjing 210094


Abstract

With the continuous development of deep learning, pre-trained language models have achieved excellent results in the field of natural language processing. Automatic text summarization, an important research direction within natural language processing, also benefits from large-scale pre-trained language models: in particular, such a model can be used to generate an abstractive summary that accurately reflects the main idea of the original text. However, current research still faces several problems: insufficient understanding of the semantic information of the source document, an inability to effectively represent polysemous words, repeated content in the generated summary, and weak logical coherence. To alleviate these problems, this paper proposes a new abstractive text summarization model, TextRank-BERT-PGN-Coverage (TB-PC), based on the BERT pre-trained language model. The model uses the classical Encoder-Decoder framework with pre-trained weights to generate summaries. The CNN/Daily Mail dataset is used as the experimental dataset. Experimental results show that, compared with existing research results in this field, the proposed model achieves better performance.
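The coverage mechanism named in the model (and in the keywords below) is a standard way to address the repeated-content problem the abstract mentions: the decoder keeps a running sum of past attention distributions and is penalized for re-attending to already-covered source positions. The sketch below is an illustrative reconstruction of that penalty, not the authors' implementation; the function name and the plain-list representation of attention distributions are assumptions made for clarity.

```python
def coverage_loss(attentions):
    """Coverage penalty over a sequence of decoder attention steps.

    attentions: list of attention distributions, one per decoder step,
    each a list of non-negative weights over the source tokens.
    """
    src_len = len(attentions[0])
    coverage = [0.0] * src_len          # cumulative attention so far
    total_loss = 0.0
    for att in attentions:
        # Penalize overlap between this step's attention and what has
        # already been attended to: sum_i min(a_t[i], c_t[i]).
        total_loss += sum(min(a, c) for a, c in zip(att, coverage))
        # Update the coverage vector with this step's attention.
        coverage = [c + a for c, a in zip(coverage, att)]
    return total_loss
```

A decoder that attends to the same source token twice is penalized, while one that spreads its attention over fresh positions incurs no loss; in training this term is typically added to the negative log-likelihood objective with a small weight.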


Key words

abstractive summarization / TextRank algorithm / BERT model / pointer generator network / coverage mechanism

Classification

Information technology and security science

Citation

周圆, 张琨, 陈智源, 江浩俊, 方自正. 基于BERT模型的生成式自动文本摘要[J]. 计算机与数字工程, 2024, 52(10): 3052-3058, 7.

Computer and Digital Engineering | OACSTPCD | ISSN 1672-9722
