
融合卷积收缩门控的生成式文本摘要方法

Abstractive Text Summarization Method Incorporating Convolutional Shrinkage Gating

甘陈敏¹ 唐宏¹ 杨浩澜¹ 刘小洁¹ 刘杰¹

计算机工程 (Computer Engineering), 2024, Vol. 50, Issue 2: 98-104. DOI: 10.19678/j.issn.1000-3428.0066847

Author Information

  • 1. School of Communication and Information Engineering, Chongqing University of Posts and Telecommunications, Chongqing 400065, China; Chongqing Key Laboratory of Mobile Communication Technology, Chongqing University of Posts and Telecommunications, Chongqing 400065, China


Abstract

Driven by deep learning techniques, the Sequence-to-Sequence (Seq2Seq) model, built on an encoder-decoder architecture combined with an attention mechanism, is widely used in text summarization research, particularly for abstractive text summarization, and has achieved remarkable results. However, existing models based on Recurrent Neural Networks (RNNs) suffer from limited parallelism and low time efficiency, and tend to produce summaries that are redundant, repetitive, or semantically irrelevant. In addition, these models often fail to fully capture useful information and ignore the connections between words and sentences. To address these challenges, a text summarization method based on the Transformer and convolutional shrinkage gating is proposed. BERT is used as the encoder to extract text representations at different levels and obtain the contextual encoding. A convolutional shrinkage gating unit then adjusts the encoding weights, strengthens global relevance, removes interference from useless information, and yields the filtered encoding output. Three decoders are designed: a basic Transformer decoding module, a decoding module that shares the encoder, and a decoding module using GPT. These aim to strengthen the association between encoder and decoder and to explore model structures capable of generating high-quality summaries. On both the LCSTS and CNNDM datasets, the evaluation scores of the TCSG, ES-TCSG, and GPT-TCSG models increase by no less than 1.0 relative to mainstream benchmark models, verifying the validity and feasibility of the method.
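The abstract does not give the exact formulation of the convolutional shrinkage gating unit, so the sketch below is only one plausible reading, not the paper's implementation: a 1-D convolution over the encoder states produces a per-position sigmoid gate that down-weights ("shrinks") low-relevance components of the contextual encoding. All names, shapes, and the toy weights are assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def conv_shrinkage_gate(H, W, b, kernel_size=3):
    """Hypothetical convolutional shrinkage gate over encoder states.

    H : (seq_len, d)          contextual encodings (e.g. from BERT)
    W : (kernel_size * d, d)  convolution weights producing gate logits
    b : (d,)                  gate bias
    Returns gated encodings of shape (seq_len, d).
    """
    seq_len, d = H.shape
    pad = kernel_size // 2
    Hp = np.pad(H, ((pad, pad), (0, 0)))  # zero-pad along the sequence axis
    # Slide a window over the sequence and flatten it: (seq_len, kernel_size * d)
    windows = np.stack([Hp[i:i + kernel_size].reshape(-1) for i in range(seq_len)])
    gate = sigmoid(windows @ W + b)       # per-position, per-dimension gate in (0, 1)
    return gate * H                       # shrink low-relevance components

# Toy usage: with zero weights the gate is uniformly sigmoid(0) = 0.5
rng = np.random.default_rng(0)
H = rng.standard_normal((5, 4))
W = np.zeros((3 * 4, 4))
b = np.zeros(4)
out = conv_shrinkage_gate(H, W, b)
assert np.allclose(out, 0.5 * H)
```

In a trained model, W and b would be learned so the gate passes globally relevant positions nearly unchanged while suppressing noisy ones, which matches the filtering role the abstract describes.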


Key words

abstractive text summarization; Sequence-to-Sequence (Seq2Seq) model; Transformer model; BERT encoder; convolutional shrinkage gating unit; decoder

Classification

Information Technology and Security Science

Cite This Article

甘陈敏, 唐宏, 杨浩澜, 刘小洁, 刘杰. 融合卷积收缩门控的生成式文本摘要方法[J]. 计算机工程, 2024, 50(2): 98-104.

Funding

Program for Changjiang Scholars and Innovative Research Team in University (IRT_16R72)

计算机工程 (Computer Engineering) · OA · 北大核心 · CSTPCD · ISSN 1000-3428
