计算机应用研究 (Application Research of Computers), 2025, Vol. 42, Issue 9: 2621-2630, 10. DOI: 10.19734/j.issn.1001-3695.2025.03.0044
Topic structure enhanced entity coreference resolution in large language models
Abstract
To address the suboptimal performance of large language model (LLM)-based entity coreference resolution (ECR) methods on long texts, and the high computational cost of full-parameter fine-tuning, this study developed a topic-structure-enhanced ECR model leveraging prompt-based learning. The model utilized contextual topic structure information to improve the capture of long-range coreference relations. Additionally, a learnable prompt template significantly reduced the computational overhead of fine-tuning. Experimental results demonstrated that the proposed method outperformed baseline models by 2.3, 0.5, and 2.6 percentage points on three public datasets, respectively. Furthermore, compared with state-of-the-art models such as Link-Append and Seq2seqCoref, the proposed method achieved approximately 98% of their performance while using only about 1.1% of their parameters. This demonstrates the model's effectiveness and computational efficiency for long-text ECR tasks.
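The abstract's efficiency claim rests on prompt tuning: only a small set of learnable prompt embeddings is updated while the backbone model stays frozen. A minimal sketch of the parameter-budget arithmetic behind such claims (all sizes below are illustrative assumptions, not the paper's actual configuration):

```python
# Hedged sketch: why training only a soft prompt is cheap.
# A soft prompt is a (prompt_len x hidden) matrix of learnable embeddings
# prepended to the input; the backbone's own parameters are frozen.

def trainable_fraction(prompt_len: int, hidden: int, model_params: int) -> float:
    """Fraction of total parameters that are updated when only the
    soft prompt (prompt_len * hidden values) is trainable."""
    prompt_params = prompt_len * hidden
    return prompt_params / model_params

# Illustrative example: a 100-token soft prompt over a 1024-dim hidden
# space on a ~770M-parameter backbone (hypothetical sizes).
frac = trainable_fraction(prompt_len=100, hidden=1024, model_params=770_000_000)
print(f"trainable share: {frac:.6%}")  # a tiny share of the full model
```

The exact fraction depends on prompt length, hidden size, and backbone size; the point is that the trainable share scales with the prompt, not with the model.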
Keywords
entity coreference resolution / topic model / prompt-based learning / LLMs
Classification
Information Technology and Security Science
Citation
刘小明, 吴彦博, 杨关, 刘杰, 吴佳昊. Topic structure enhanced entity coreference resolution in large language models [J]. 计算机应用研究, 2025, 42(9): 2621-2630, 10.
Funding
National Science and Technology Major Project "New Generation Artificial Intelligence" (2020AAA0109703)
Key Program of the Joint Funds of the National Natural Science Foundation of China (U23B2029)
National Natural Science Foundation of China (62076167, 61772020)
Key Scientific Research Project of Higher Education Institutions of Henan Province (24A520058, 24A520060, 23A520022)
Postgraduate Education Reform and Quality Improvement Project of Henan Province (YJS2024AL053)