计算机技术与发展 (Computer Technology and Development), 2024, Vol. 34, Issue 1: 52-58, 7. DOI: 10.3969/j.issn.1673-629X.2024.01.008
Decoupled Knowledge Distillation Based on Inter-class Ranking Correlation
Abstract
Knowledge distillation has achieved great success since it was proposed, but many distillation strategies focus on features of the hidden layers and overlook the potential of logit distillation. Decoupled knowledge distillation has brought logit distillation back into public view. However, both classical knowledge distillation and decoupled knowledge distillation impose strong consistency constraints that make the distillation effect sub-optimal, especially when the teacher network and student network structures differ. To solve this problem, a method based on the consistency of inter-class ranking relations is proposed. The method preserves the relationship between the teacher's and student's non-target class predictions, and uses inter-class ranking correlation as the link between the surrogate loss and the evaluation metric in the knowledge distillation model, so as to match the relations of the teacher and student networks. This relatively relaxed relation matching is extended to decoupled knowledge distillation and verified on the CIFAR-100 and ImageNet-1K datasets. The experimental results show that the classification accuracy of the proposed method on CIFAR-100 reaches 77.38%, which is 0.93 percentage points higher than that of the baseline method. The effect of decoupled knowledge distillation on image classification is improved, which verifies the effectiveness of the proposed method. Comparative experiments further show that the method is competitive.
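The core idea described in the abstract (matching the *ranking* of the teacher's and student's non-target class predictions rather than their exact values) can be sketched as follows. This is a minimal illustrative sketch, not the paper's exact loss: the function names are hypothetical, and Spearman rank correlation is assumed here as the ranking-correlation measure.

```python
import numpy as np

def spearman_corr(x, y):
    # Rank-transform both vectors, then take the Pearson
    # correlation of the ranks (valid when there are no ties).
    rx = np.argsort(np.argsort(x)).astype(float)
    ry = np.argsort(np.argsort(y)).astype(float)
    rx -= rx.mean()
    ry -= ry.mean()
    return float((rx * ry).sum() / (np.linalg.norm(rx) * np.linalg.norm(ry)))

def ranking_match_loss(student_logits, teacher_logits, target):
    # Drop the target class so that only the non-target
    # predictions are compared, as in decoupled distillation.
    keep = np.arange(len(student_logits)) != target
    rho = spearman_corr(student_logits[keep], teacher_logits[keep])
    # Loss is 0 when the two class rankings agree perfectly
    # and 2 when they are exactly reversed.
    return 1.0 - rho
```

In a real training loop this rank-based loss would need a differentiable relaxation (e.g. a soft ranking or a plain Pearson correlation on softened probabilities); the sketch above only shows the relation-matching objective itself.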
Key words
knowledge distillation / decoupled knowledge distillation / strong consistency constraint / relation matching / ranking correlation
Classification
Information Technology and Security Science
Citation
CHEN Ying, ZHU Ziqi, XU Shicheng, LI Min. Decoupled knowledge distillation based on inter-class ranking correlation [J]. 计算机技术与发展 (Computer Technology and Development), 2024, 34(1): 52-58, 7.
Funding
National Natural Science Foundation of China (61702382)