Journal of Frontiers of Computer Science and Technology, 2023, Vol. 17, Issue 11: 2721-2733. DOI: 10.3778/j.issn.1673-9418.2204107
Multi-teacher Contrastive Knowledge Inversion for Data-Free Distillation
Abstract
Keywords
model compression / data-free / knowledge distillation / data protection / privacy protection
Classification
Information Technology and Security Science
Cite this article
LIN Zhenyuan, LIN Shaohui, YAO Yiwu, HE Gaoqi, WANG Changbo, MA Lizhuang. Multi-teacher Contrastive Knowledge Inversion for Data-Free Distillation[J]. Journal of Frontiers of Computer Science and Technology, 2023, 17(11): 2721-2733.
Funding
This work was supported by the National Natural Science Foundation of China (72192821, 62102151), the Sailing Program of the Science and Technology Commission of Shanghai (21YF1411200), and the CAAI-Huawei MindSpore Open Fund (CAAIXSJLJJ-2021-031A).