通信学报 (Journal on Communications), 2025, Vol. 46, Issue (11): 114-126, 13. DOI: 10.11959/j.issn.1000-436x.2025219
面向动态算力节点的联邦学习差分隐私重校准方法
Federated learning with differential privacy recalibration for dynamic computing nodes
Abstract
To address the issues of privacy budget overrun, low communication efficiency, and high training latency caused by the dynamic participation of computing nodes in computing power networks, a federated learning differential privacy recalibration method was proposed. Firstly, a dynamic privacy budget calibration mechanism was designed by modeling node-exit probabilities and applying real-time budget recycling, along with adaptive noise-intensity adjustment during training. Subsequently, a contribution-driven sparse gradient encoding protocol was constructed, which filters critical parameters based on gradient importance weights and employs layered noise injection with 8-bit quantization compression to significantly reduce communication overhead. Simultaneously, a computing-capacity-aware batch adjustment algorithm dynamically allocates local batch sizes according to device computational capabilities to minimize latency. Experiments demonstrate that this method achieves 30.1% privacy budget savings under dynamic node variations while reducing communication volume by 19.6% and maintaining comparable model performance, effectively enhancing model accuracy, communication efficiency, and system robustness.
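The contribution-driven sparse gradient encoding step summarized above can be pictured, at a rough level, as selecting high-importance gradient entries, adding noise to the surviving values, and packing them into 8-bit integers before upload. The following Python/NumPy sketch is illustrative only: the function names sparse_encode and sparse_decode, the use of absolute gradient magnitude as the importance weight, the Gaussian noise scale sigma, and the uniform quantizer are assumptions made for exposition, not the paper's exact protocol.

import numpy as np

def sparse_encode(grad, keep_ratio=0.1, sigma=0.01, rng=None):
    # Illustrative sketch (assumed names and parameters): keep the
    # highest-magnitude gradient entries, add Gaussian noise to the kept
    # values, then quantize the kept values to 8-bit integers.
    rng = np.random.default_rng() if rng is None else rng
    flat = np.asarray(grad, dtype=np.float32).ravel()
    k = max(1, int(keep_ratio * flat.size))
    # importance weight approximated here by absolute gradient magnitude
    idx = np.argpartition(np.abs(flat), -k)[-k:]
    vals = flat[idx] + rng.normal(0.0, sigma, size=k)      # noise injection
    # uniform 8-bit quantization of the surviving values
    lo, hi = float(vals.min()), float(vals.max())
    scale = (hi - lo) / 255.0 if hi > lo else 1.0
    q = np.round((vals - lo) / scale).astype(np.uint8)
    return idx.astype(np.uint32), q, lo, scale              # compact payload

def sparse_decode(idx, q, lo, scale, shape):
    # Server-side reconstruction into a dense gradient of the given shape.
    flat = np.zeros(int(np.prod(shape)), dtype=np.float32)
    flat[idx] = q.astype(np.float32) * scale + lo
    return flat.reshape(shape)

if __name__ == "__main__":
    g = np.random.randn(4, 8).astype(np.float32)
    payload = sparse_encode(g, keep_ratio=0.25)
    g_hat = sparse_decode(*payload, shape=g.shape)

Under this sketch a client uploads (idx, q, lo, scale) instead of the dense gradient, so the per-round payload shrinks to roughly keep_ratio of its original size plus index overhead; the 19.6% reduction in communication volume reported in the abstract refers to the paper's full protocol, not to this sketch.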
Keywords: computing power network / federated learning / differential privacy / sparse gradient encoding
Classification: Information technology and security science
Cite this article: 陈宁江, 郑泽章, 章德华. Federated learning with differential privacy recalibration for dynamic computing nodes[J]. 通信学报, 2025, 46(11): 114-126, 13.
Funding: The National Natural Science Foundation of China (No. 62162003)