
Low-Cost Federated Learning Based on Lightweight Self-Distillation

刘松 1, 罗杨宇 2, 许佳培 1, 张建忠 2

电子学报 (Acta Electronica Sinica), 2025, Vol. 53, Issue (1): 259-269, 11. DOI: 10.12263/DZXB.20240325

Author Information

  • 1. College of Computer Science, Nankai University, Tianjin 300350, China; Key Laboratory of Data and Intelligent Systems Security (Ministry of Education), Tianjin 300350, China
  • 2. College of Cyber Science, Nankai University, Tianjin 300350, China; Key Laboratory of Data and Intelligent Systems Security (Ministry of Education), Tianjin 300350, China


Abstract

With the development of edge computing, the training of deep learning models increasingly relies on private data generated by a large number of edge devices. In this context, federated learning has drawn extensive attention from both academia and industry due to its prominent privacy protection capabilities. In practice, however, federated learning faces challenges such as inefficient training and suboptimal model quality caused by data heterogeneity and limited computational resources. Inspired by the concept of knowledge distillation, this paper proposes an efficient federated learning algorithm named efficient federated learning with lightweight self knowledge distillation (FedSKD). The algorithm uses lightweight self-distillation to extract intrinsic knowledge during local training, alleviating local model overfitting and enhancing generalization. It then aggregates the generalization capability of the local models into a global model through server-side parameter aggregation, improving the quality and convergence speed of the global model. In addition, a dynamic synchronization mechanism further improves the accuracy and training efficiency of the global model. Experimental results show that, under non-identically distributed data partition strategies, FedSKD improves model accuracy and training efficiency while reducing computational cost. On CIFAR-10/100, compared with the latest baseline FedMLD, FedSKD achieves an average accuracy improvement of 2% and reduces training cost by an average of 56%.
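The abstract describes the algorithm only at a high level. As a rough illustration of the general recipe (not the paper's actual FedSKD implementation), the PyTorch-style sketch below shows one way a client could combine cross-entropy with a lightweight self-distillation term, here using a frozen copy of the model received from the server as the "teacher", followed by FedAvg-style parameter averaging on the server. All names and hyperparameters (local_train, fedavg, kd_weight, temperature) are illustrative assumptions, and the dynamic synchronization mechanism mentioned in the abstract is not modeled.

# Hypothetical sketch: self-distillation-regularized local training + FedAvg aggregation.
# This is NOT the FedSKD code from the paper; it only illustrates the general idea.
import copy
import torch
import torch.nn.functional as F


def local_train(global_model, loader, epochs=1, lr=0.01,
                kd_weight=0.5, temperature=2.0):
    """One client's local update: cross-entropy plus a self-distillation term."""
    teacher = copy.deepcopy(global_model).eval()   # frozen snapshot acts as "teacher"
    student = copy.deepcopy(global_model).train()
    opt = torch.optim.SGD(student.parameters(), lr=lr)
    for _ in range(epochs):
        for x, y in loader:
            logits = student(x)
            ce = F.cross_entropy(logits, y)
            with torch.no_grad():
                t_logits = teacher(x)
            # KL divergence between softened teacher and student distributions
            kd = F.kl_div(F.log_softmax(logits / temperature, dim=1),
                          F.softmax(t_logits / temperature, dim=1),
                          reduction="batchmean") * temperature ** 2
            loss = ce + kd_weight * kd
            opt.zero_grad()
            loss.backward()
            opt.step()
    return student.state_dict(), len(loader.dataset)


def fedavg(client_updates):
    """FedAvg-style weighted averaging of client state_dicts by sample count."""
    total = sum(n for _, n in client_updates)
    avg = copy.deepcopy(client_updates[0][0])
    for key in avg:
        mixed = sum(sd[key].float() * (n / total) for sd, n in client_updates)
        avg[key] = mixed.to(avg[key].dtype)
    return avg

A server loop would repeatedly sample clients, call local_train with the current global weights, and pass the returned (state_dict, sample_count) pairs to fedavg to produce the next global model.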

Key words

federated learning / self-distillation / non-independent and identically distributed (non-IID) / deep learning / edge computing

Classification

Computers and Automation

Cite This Article

刘松, 罗杨宇, 许佳培, 张建忠. Low-Cost Federated Learning Based on Lightweight Self-Distillation[J]. 电子学报 (Acta Electronica Sinica), 2025, 53(1): 259-269, 11.

Funding

Technology Research and Development Program of Tianjin (No. 18ZXZNGX00200)
