
Efficient Parameter Fine Tuning Techniques for Federated Large Models Based on Context Consistency

蒋大锐 贺敏伟 徐胜超

计算机与数字工程 (Computer and Digital Engineering), 2025, Vol. 53, Issue 4: 1051-1055, 1069. DOI: 10.3969/j.issn.1672-9722.2025.04.023


蒋大锐 1, 贺敏伟 1, 徐胜超 1

Author Information

  • 1. School of Artificial Intelligence, Guangzhou Huashang College, Guangzhou 511300

Abstract

In order to improve the performance of federated models and reduce model resource consumption, an efficient parameter tuning technique for federated models based on context consistency is designed. This paper reduces training time and computational resource consumption for efficient parameter tuning by detecting and handling context consistency errors. It develops a resource-saving parameter-efficient fine-tuning method that constructs an independent lightweight branch path outside the base model, takes the mid-level features of the base model as input, fuses the pre-training knowledge of the backbone model, and completes efficient parameter fine-tuning. Before the intermediate-level features are extracted, a dynamic iterative pruning method is used to perform federated pruning on the intermediate layers to improve feature extraction efficiency. The test results show that, with the designed method applied, the FLOPs of the ViT-B/16 model on different image-domain test sets are below 1 billion, and the GPU memory usage is only 6.0 GB.
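The abstract's core idea (a frozen backbone whose mid-level features feed a small trainable side branch, with the intermediate layer pruned before feature extraction) can be sketched in a few lines. This is a minimal illustrative sketch, not the paper's implementation: all class names, dimensions, the magnitude-based pruning rule, and the toy SGD step are assumptions standing in for the method described above.

```python
import numpy as np

rng = np.random.default_rng(0)

class FrozenBackbone:
    """Two frozen linear layers standing in for a pre-trained model;
    forward() exposes the mid-level feature that the side branch consumes."""
    def __init__(self, d_in=16, d_mid=8, d_out=4):
        self.W1 = rng.standard_normal((d_in, d_mid))   # frozen
        self.W2 = rng.standard_normal((d_mid, d_out))  # frozen
    def forward(self, x):
        mid = np.tanh(x @ self.W1)   # mid-level feature
        return mid, mid @ self.W2    # feature + backbone prediction

class LightweightBranch:
    """Independent lightweight path outside the base model; it holds the
    only trainable parameters in the whole setup."""
    def __init__(self, d_mid=8, d_out=4):
        self.V = np.zeros((d_mid, d_out))
    def forward(self, mid):
        return mid @ self.V

def magnitude_prune(W, ratio=0.5):
    """Crude stand-in for the paper's dynamic iterative pruning: zero out
    the smallest-magnitude weights of an intermediate layer."""
    thresh = np.quantile(np.abs(W), ratio)
    return W * (np.abs(W) >= thresh)

backbone = FrozenBackbone()
branch = LightweightBranch()

# Prune the intermediate layer before features are extracted.
backbone.W1 = magnitude_prune(backbone.W1)

# One illustrative SGD step on toy data: the fused prediction adds the
# branch output to the frozen backbone's output, and only V is updated.
x = rng.standard_normal((5, 16))
y = rng.standard_normal((5, 4))
mid, base_out = backbone.forward(x)
err = (base_out + branch.forward(mid)) - y   # fused prediction error
branch.V -= 0.1 * (mid.T @ err) / len(x)     # backbone weights untouched

print("trainable:", branch.V.size, "frozen:", backbone.W1.size + backbone.W2.size)
```

The design point is that fine-tuning touches only the branch's few weights (32 here versus 160 frozen ones), which is what keeps FLOPs and memory low in the reported results.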


Key words

contextual consistency / federated large model / lightweight branch / parameter fine-tuning / dynamic iterative pruning

Classification

Information Technology and Security Science

Cite This Article

蒋大锐, 贺敏伟, 徐胜超. Efficient Parameter Fine Tuning Techniques for Federated Large Models Based on Context Consistency [J]. 计算机与数字工程, 2025, 53(4): 1051-1055, 1069.

Funding

National Natural Science Foundation of China General Program (No. 61972444)

Supported by the Guangzhou Huashang College Mentorship Research Project (No. 2024HSDS27)

计算机与数字工程 (Computer and Digital Engineering), ISSN 1672-9722
