Journal of Changzhou University (Social Science Edition), 2025, Vol. 26, Issue 4: 12-21, 10. DOI: 10.3969/j.issn.2095-042X.2025.04.002
The Dilemma and Solution for Legal Large Language Model Hallucination

Abstract
In recent years, large language models have shown excellent performance in natural language processing tasks, bringing new opportunities for the development of legal artificial intelligence. Legal large language models, generated by fine-tuning general-purpose foundation models, can complete legal tasks such as understanding legal information, answering legal knowledge questions, and generating legal texts. However, hallucination, a phenomenon that frequently occurs in large language models, is even more serious in current legal large language models, which are falling into a hallucination dilemma. It is necessary to clarify the factors that lead to frequent hallucination in legal large language models, and to standardize and improve these models so that they meet the accuracy requirements of legal tasks.

Key words: legal large language model / legal artificial intelligence / hallucination in large models

Classification: Social Sciences

Citation: CAO Quanlai, WANG Sihan. The Dilemma and Solution for Legal Large Language Model Hallucination [J]. Journal of Changzhou University (Social Science Edition), 2025, 26(4): 12-21, 10.

Funding
Key Project of the Ministry of Justice on National Rule of Law and Legal Theory Research, "Research on Xi Jinping Thought on the Rule of Law and the Path and Countermeasures of Law-based National Governance" (21SFB1001)
Major Judicial Research Project of the Supreme People's Court, "Research on the Construction of Digital Courts" (GFZDKT2024B29-2)