
大模型微调的多领域机器翻译方法综述

陈子建 王斯日古楞 斯琴图

计算机科学与探索, 2025, Vol. 19, Issue (4): 916-928, 13. DOI: 10.3778/j.issn.1673-9418.2410032

Survey of Multi-domain Machine Translation Methods for Fine-Tuning Large Models

陈子建¹, 王斯日古楞¹, 斯琴图¹

Author Information

  • 1. College of Computer Science and Technology, Inner Mongolia Normal University, Hohhot 010022, China

Abstract

With the rapid development of machine translation technology, machine translation methods based on pre-trained large models have come to occupy an important position in natural language processing. However, because language features, lexical styles, and modes of expression differ significantly across domains, it is difficult for a single pre-trained model to achieve efficient and stable performance on multi-domain translation tasks. This paper therefore focuses on the key issues of large-model fine-tuning in multi-domain machine translation, systematically reviews the core principles, main methods, and application effects of fine-tuning techniques, and analyzes the performance and applicable scenarios of three classes of strategies: full-parameter fine-tuning, parameter-efficient fine-tuning, and prompt-tuning. The advantages and limitations of the different fine-tuning methods are discussed in depth, with emphasis on how to balance domain generalization ability against task specificity through efficient fine-tuning strategies under resource-constrained conditions, demonstrating the significant advantages of parameter-efficient fine-tuning and prompt-tuning in resource efficiency and domain adaptability. The practical effects of the different strategies in domain transfer and resource utilization are further evaluated through comparative analysis and experimental validation, and their effectiveness is verified through case studies. Future research should focus on efficient use of resources, the domain-adaptive capability of models, and improvements in translation quality and robustness, so as to advance multi-domain machine translation systems in both performance and adaptability.
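The abstract's contrast between full-parameter and parameter-efficient fine-tuning can be made concrete with a minimal sketch of a LoRA-style low-rank adapter, one representative parameter-efficient method. The dimensions, rank, and scaling factor below are illustrative assumptions, not values taken from the paper:

```python
import numpy as np

# Instead of updating the full d_out x d_in weight matrix W (full-parameter
# fine-tuning), a LoRA-style adapter learns a low-rank update B @ A of rank r
# while keeping the pretrained W frozen. Dimensions here are hypothetical.
d_in, d_out, r = 1024, 1024, 8
rng = np.random.default_rng(0)

W = rng.standard_normal((d_out, d_in))     # frozen pretrained weight
A = rng.standard_normal((r, d_in)) * 0.01  # trainable low-rank factor
B = np.zeros((d_out, r))                   # trainable; zero-initialized so
                                           # training starts from W exactly

def forward(x, alpha=16.0):
    """Apply the effective weight W + (alpha / r) * B @ A to input x."""
    return W @ x + (alpha / r) * (B @ (A @ x))

# Before any training (B is zero), the adapted layer matches the base model.
x = rng.standard_normal(d_in)
assert np.allclose(forward(x), W @ x)

full_params = W.size           # parameters updated by full fine-tuning
lora_params = A.size + B.size  # parameters updated by the adapter
print(f"full fine-tuning updates {full_params} parameters")
print(f"LoRA-style adapter updates {lora_params} "
      f"({100 * lora_params / full_params:.2f}%)")
```

The parameter count shrinks from roughly a million to about sixteen thousand per layer, which is the "resource utilization efficiency" advantage the abstract attributes to parameter-efficient methods; a separate small adapter can also be kept per domain while the base weights are shared.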


Key words

large model fine-tuning; multi-domain machine translation; full-parameter fine-tuning; parameter-efficient fine-tuning; prompt-tuning

Classification

Computer and Automation

Cite This Article

陈子建, 王斯日古楞, 斯琴图. 大模型微调的多领域机器翻译方法综述[J]. 计算机科学与探索, 2025, 19(4): 916-928, 13.

Funding

This work was supported by the National Natural Science Foundation of China (61762072) and the Natural Science Foundation of Inner Mongolia (2022MS06002, 2024LHMS06024).

计算机科学与探索

OA | Peking University Core Journal

ISSN 1673-9418
