Abstract
Federated learning, an emerging distributed computing paradigm with built-in privacy protection, safeguards user privacy and data security to a certain extent. However, in federated learning systems, the frequent exchange of model parameters between clients and servers incurs significant communication overhead. In bandwidth-limited wireless communication scenarios, this overhead has become the primary bottleneck restricting the development of federated learning. To address this problem, this paper proposes a dynamic sparse compression algorithm based on the Z-Score. The algorithm performs outlier detection on local model updates using the Z-Score, treating significant update values as outliers and selecting them for transmission. It thus sparsifies model updates without complex sorting algorithms or prior knowledge of the original updates. Meanwhile, as communication rounds increase, the sparsity rate is dynamically adjusted according to the loss of the global model, minimizing total traffic while preserving model accuracy. Experiments show that, in the I.I.D. data scenario, the proposed algorithm reduces communication traffic by 95% compared with the federated averaging algorithm, with an accuracy loss of only 1.6%. Compared with the FTTQ algorithm, it further reduces communication traffic by 40%~50%, with only a 1.29% decrease in accuracy. These results demonstrate that the method can significantly reduce communication cost while maintaining model performance.
Key words
federated learning/Z-Score/sparsification/dynamic sparsity rate
Classification
Information Technology and Security Science
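The core idea summarized in the abstract, selecting significant entries of a local model update by treating them as Z-Score outliers, can be sketched as follows. This is a minimal illustrative implementation, not the paper's exact formulation; the function name `zscore_sparsify` and the outlier threshold `z_thresh` are assumptions introduced here for illustration.

```python
import numpy as np

def zscore_sparsify(update: np.ndarray, z_thresh: float = 2.0) -> np.ndarray:
    """Sparsify a model update by keeping only Z-Score outliers.

    Entries whose absolute Z-Score exceeds `z_thresh` are treated as
    "significant" and kept; all other entries are zeroed. No sorting and
    no prior knowledge of the update distribution are required.
    """
    mu = update.mean()
    sigma = update.std()
    if sigma == 0.0:
        # Degenerate case: all entries identical, nothing stands out.
        return update.copy()
    z = np.abs((update - mu) / sigma)
    # Keep outliers (significant updates), zero the rest.
    return np.where(z > z_thresh, update, 0.0)

# Example: one dominant gradient entry survives sparsification.
u = np.array([0.01, -0.02, 0.03, 5.0, -0.01, 0.02])
sparse_u = zscore_sparsify(u, z_thresh=1.5)
```

Lowering `z_thresh` keeps more entries (lower sparsity); raising it keeps fewer. A dynamic scheme, as described in the abstract, would adjust this threshold (and hence the sparsity rate) across communication rounds based on the global model's loss.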