
基于差分隐私的大语言模型指令微调技术 (Instruction Fine-tuning Techniques for Large Language Models Based on Differential Privacy)

蒋金陵 徐胜超 杨波 毛明扬 蒋大锐

计算机与数字工程 (Computer & Digital Engineering), 2025, Vol. 53, Issue 2: 493-498. DOI: 10.3969/j.issn.1672-9722.2025.02.034



Author information

  • 1. School of Artificial Intelligence, Guangzhou Huashang College (广州华商学院人工智能学院), Guangzhou 511300

Abstract

Large language models require very large amounts of data, so instruction fine-tuning often leaks private information, which in turn degrades fine-tuning performance. To address this, an instruction fine-tuning technique for large language models based on differential privacy is proposed. Under differential privacy, the sensitivity of the instruction dataset is first computed, the magnitude of the random noise to be introduced is derived from it, and that noise is added to the instruction dataset. A large number of model parameters are then read from the noised dataset, the model's loss function is set, the parameters are updated via gradient values, and the instruction fine-tuning parameters are computed. The model's evaluation score is calculated to assess its performance after this initial fine-tuning. A low-rank matrix is then introduced to fine-tune the large language model a second time, optimizing its performance. Experimental results show that the proposed technique achieves an average perplexity of 0.35 in practical applications, indicating good fine-tuning performance.
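The noise-injection step described in the abstract follows the standard differential-privacy recipe: bound the sensitivity of the released quantity, then add noise scaled by sensitivity / ε. The abstract does not specify the paper's exact mechanism, so the sketch below is only a minimal illustration using the classic Laplace mechanism on a per-example statistic of a hypothetical instruction dataset; the token-count statistic, the [0, 512] clamping range, and ε = 1.0 are all assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)

def laplace_mechanism(values, lower, upper, epsilon):
    """Release an epsilon-DP estimate of the mean of `values`.

    Each value is clamped to [lower, upper], so replacing one record
    changes the mean by at most (upper - lower) / n -- the sensitivity.
    Laplace noise with scale sensitivity / epsilon then yields
    epsilon-differential privacy for the released mean.
    """
    values = np.clip(np.asarray(values, dtype=float), lower, upper)
    sensitivity = (upper - lower) / len(values)
    return values.mean() + rng.laplace(loc=0.0, scale=sensitivity / epsilon)

# Hypothetical per-example statistic of an instruction dataset:
# token counts per instruction, clamped to an assumed [0, 512] range.
token_counts = rng.integers(20, 400, size=1000)
private_mean = laplace_mechanism(token_counts, lower=0, upper=512, epsilon=1.0)
```

Clamping first is what makes the sensitivity bound (upper − lower) / n valid; without it, a single outlier record could shift the mean arbitrarily and no finite noise scale would suffice.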


Keywords

differential privacy / large language model (LLM) / instruction fine-tuning / fine-tuning strategy / fine-tuning parameters / data privacy / random noise
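The "low-rank matrix" used for secondary fine-tuning in the abstract matches the general LoRA-style idea: freeze the pretrained weight W and train only a rank-r update B·A. The NumPy sketch below shows this for a single linear layer; the toy dimensions, loss, and learning rate are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# One frozen linear layer W plus a trainable low-rank update B @ A,
# with rank r much smaller than the layer's dimensions (toy sizes here).
d_out, d_in, r = 8, 16, 2
W = rng.standard_normal((d_out, d_in))     # frozen pretrained weight
A = 0.01 * rng.standard_normal((r, d_in))  # trainable factor
B = np.zeros((d_out, r))                   # trainable factor; zero init
                                           # makes the adapter a no-op at start

def forward(x):
    """Adapted layer: y = (W + B A) x, with W never updated."""
    return (W + B @ A) @ x

# One SGD step on the squared-error loss 0.5 * ||forward(x) - y||^2,
# updating only the low-rank factors A and B.
x = rng.standard_normal(d_in)
y = rng.standard_normal(d_out)
err = forward(x) - y                 # dL/dy_hat
grad_B = np.outer(err, A @ x)        # dL/dB = err (A x)^T
grad_A = B.T @ np.outer(err, x)      # dL/dA = B^T err x^T
lr = 0.1
B -= lr * grad_B
A -= lr * grad_A
```

With d_out = 8, d_in = 16 and r = 2, the adapter trains r(d_in + d_out) = 48 parameters instead of the 128 in W, which is why a low-rank second pass is a cheap way to refine an already fine-tuned model.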

Classification

Computer and Automation

Cite this article

蒋金陵, 徐胜超, 杨波, 毛明扬, 蒋大锐. 基于差分隐私的大语言模型指令微调技术[J]. 计算机与数字工程, 2025, 53(2): 493-498.

Funding

National Natural Science Foundation of China, General Program (No. 61972444)

Guangzhou Huashang College Internal Research Mentorship Program (No. 2023HSDS28)

ISSN: 1672-9722
