计算机应用研究 2024, Vol. 41, Issue 11: 3281-3287. DOI: 10.19734/j.issn.1001-3695.2024.03.0084
Knowledge Distillation in Federated Learning Based on a Latent Space Generator
Abstract
User heterogeneity poses significant challenges to federated learning (FL), leading to global model bias and slow convergence. To address this problem, this paper proposes FedLSG, a method that combines knowledge distillation with a latent space generator. A central server learns a generative model whose latent space generator extracts and simulates the probability distribution of sample labels across user devices, then generates richer and more diverse pseudo-samples to guide the training of user models, thereby mitigating user heterogeneity in FL. Theoretical analysis and experimental results show that FedLSG generally achieves about 1% higher test accuracy than the existing FedGen method, improves communication efficiency in the first 20 rounds, and provides a degree of user privacy protection.
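The abstract's pipeline, aggregating client label statistics on the server, sampling labels from that distribution, and producing latent-space pseudo-features to train user models, can be sketched as follows. This is a minimal toy illustration, not the authors' implementation: all names (`aggregate_label_distribution`, `LatentGenerator`, `generate_pseudo_batch`) are hypothetical, and the generator is reduced to a per-class mean plus Gaussian noise for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

def aggregate_label_distribution(client_label_counts):
    """Server side: merge per-client label counts into one global label distribution."""
    total = np.sum(client_label_counts, axis=0).astype(float)
    return total / total.sum()

class LatentGenerator:
    """Toy conditional generator: maps a class label plus noise to a latent pseudo-feature."""
    def __init__(self, num_classes, latent_dim):
        # One (learnable, here randomly initialized) mean vector per class.
        self.class_means = rng.normal(size=(num_classes, latent_dim))

    def sample(self, labels, noise_scale=0.1):
        labels = np.asarray(labels)
        noise = rng.normal(scale=noise_scale,
                           size=(len(labels), self.class_means.shape[1]))
        return self.class_means[labels] + noise

def generate_pseudo_batch(generator, label_dist, batch_size):
    """Draw labels from the aggregated distribution, then latent pseudo-features."""
    labels = rng.choice(len(label_dist), size=batch_size, p=label_dist)
    features = generator.sample(labels)
    return features, labels

# Example: two clients with skewed (heterogeneous) label counts over 3 classes.
counts = [[90, 10, 0],   # client A mostly sees class 0
          [0, 10, 90]]   # client B mostly sees class 2
dist = aggregate_label_distribution(counts)
gen = LatentGenerator(num_classes=3, latent_dim=4)
features, labels = generate_pseudo_batch(gen, dist, batch_size=64)
```

In the full method, the generated `(features, labels)` pairs would serve as distillation targets for each user model, so every client sees samples covering the global label distribution rather than only its own skewed local one.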
Keywords
user heterogeneity / federated learning / knowledge distillation / latent space generator / probability distribution
Classification
Information technology and security science
Citation
Wang Hu, Wang Xiaofeng, Li Ke. Knowledge distillation in federated learning based on latent space generator [J]. 计算机应用研究, 2024, 41(11): 3281-3287.
Funding
National Natural Science Foundation of China (62062001)
Ningxia Young Top Talent Project (2021)