
De-biased knowledge distillation framework based on knowledge infusion and label de-biasing techniques


Journal of Electronic Science and Technology, 2024, Vol. 22, Issue 3: 57-68. DOI: 10.1016/j.jnlest.2024.100278


Yan Li 1, Tai-Kang Tian 2, Meng-Yu Zhuang 2, Yu-Ting Sun 3

Author Information

  • 1. School of Economics and Management, University of Electronic Science and Technology of China, Chengdu 611731, China
  • 2. School of Economics and Management, Beijing University of Posts and Telecommunications, Beijing 100876, China
  • 3. School of Electrical Engineering and Computer Science, The University of Queensland, Brisbane 4072, Australia

Abstract

Keywords

De-biasing; Deep learning; Knowledge distillation; Model compression


Cite This Article

Yan Li, Tai-Kang Tian, Meng-Yu Zhuang, Yu-Ting Sun. De-biased knowledge distillation framework based on knowledge infusion and label de-biasing techniques[J]. Journal of Electronic Science and Technology, 2024, 22(3): 57-68.

Funding

This work was supported by the National Natural Science Foundation of China under Grant No. 62172056 and the Young Elite Scientists Sponsorship Program by CAST under Grant No. 2022QNRC001.

Journal of Electronic Science and Technology, ISSN 1674-862X
