
Knowledge Distillation in Federated Learning Based on a Latent Space Generator

王虎 王晓峰 李可

计算机应用研究 (Application Research of Computers), 2024, Vol. 41, Issue 11: 3281-3287. DOI: 10.19734/j.issn.1001-3695.2024.03.0084


王虎 1, 王晓峰 2, 李可 1

Author information

  • 1. School of Computer Science and Engineering, North Minzu University, Yinchuan 750021, China
  • 2. School of Computer Science and Engineering, North Minzu University, Yinchuan 750021, China || Key Laboratory of Images and Graphics Intelligent Processing of the State Ethnic Affairs Commission, North Minzu University, Yinchuan 750021, China

Abstract

User heterogeneity poses significant challenges to federated learning (FL), leading to global model bias and slow convergence. To address this problem, this paper proposed FedLSG, a method combining knowledge distillation with a latent space generator. The method employed a central server to learn a generative model whose latent space generator extracted and simulated the probability distribution of sample labels across user devices, and then generated richer and more diverse pseudo-samples to guide the training of user models, thereby mitigating the effects of user heterogeneity in FL. Theoretical analysis and experimental results show that FedLSG generally achieves about 1% higher test accuracy than the existing FedGen method, improves communication efficiency in the first 20 rounds, and provides a degree of user privacy protection.
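The paper's actual model is not reproduced here; as a rough aid to the abstract, the following NumPy sketch only illustrates the general FedGen-style workflow it describes: the server aggregates per-client label statistics into a global label distribution, then a (toy, linear) generator maps sampled labels plus latent noise to pseudo-features. All names (`aggregate_label_distribution`, `generate_pseudo_features`, the weight matrix `W`) are invented for illustration and are not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def aggregate_label_distribution(client_label_counts):
    """Server side: merge per-client label counts into one global
    label distribution p(y). A plain count-based stand-in for the
    distribution estimation the abstract describes."""
    total = np.sum(client_label_counts, axis=0).astype(float)
    return total / total.sum()

def generate_pseudo_features(p_y, n_samples, latent_dim, n_classes, W):
    """Sample pseudo-labels y ~ p(y), then map (noise z, one-hot y)
    through a toy linear 'generator' W to latent-space features."""
    y = rng.choice(n_classes, size=n_samples, p=p_y)
    z = rng.normal(size=(n_samples, latent_dim))
    onehot = np.eye(n_classes)[y]
    feats = np.concatenate([z, onehot], axis=1) @ W
    return feats, y

# Two heterogeneous clients: each sees only a skewed subset of labels.
n_classes, latent_dim, feat_dim = 3, 4, 8
counts = np.array([[50, 0, 10],
                   [0, 40, 10]])
W = rng.normal(size=(latent_dim + n_classes, feat_dim))

p_y = aggregate_label_distribution(counts)
feats, y = generate_pseudo_features(p_y, 100, latent_dim, n_classes, W)
```

In the full method, `feats` would feed a knowledge-distillation term in each client's local objective; here they are only shape-checked placeholders.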

Keywords

user heterogeneity/federated learning/knowledge distillation/latent space generator/probability distribution

Classification

Information technology and security science

Cite this article

王虎, 王晓峰, 李可. 基于潜在空间生成器的联邦知识蒸馏[J]. 计算机应用研究, 2024, 41(11): 3281-3287.

Funding

National Natural Science Foundation of China (62062001)

Ningxia Young Top-notch Talent Project (2021)

计算机应用研究 (Application Research of Computers), ISSN 1001-3695; open access, indexed in 北大核心 (PKU Core) and CSTPCD
