
Knowledge Distillation Based on Instance Spectral Relations

张政秀¹, 周淳¹, 杨萌¹

计算机工程 (Computer Engineering), 2025, 51(11): 63-71, 9. DOI: 10.19678/j.issn.1000-3428.0069690

Author Information

  • 1. School of Information Science and Technology, Southwest Jiaotong University, Chengdu 611756, Sichuan, China

Abstract

The core challenge of Knowledge Distillation (KD) lies in extracting generic and sufficient knowledge from the Teacher model to effectively guide the learning of the Student model. Recent studies have found that, beyond learning soft labels, further exploiting inter-instance relations in the deep feature space helps improve the performance of Student models. Existing inter-instance relation-based KD methods widely adopt global Euclidean distance metrics to measure the affinity between instances. However, these methods overlook the intrinsic high-dimensional embedding characteristics of the deep feature space, where data are distributed on a low-dimensional manifold that is locally Euclidean-like but globally complex. To address this issue, a novel instance spectral relation-based KD method is proposed. Instead of relying on a global Euclidean distance, this strategy constructs and analyzes similarity matrices between each instance and its k nearest neighbors in the Teacher model's feature space to reveal the latent spectral graph structure. An innovative loss function is designed to guide the Student model not only to learn the probability distribution output by the Teacher model but also to mimic the inter-instance relations represented by this spectral graph structure. The experimental results demonstrate that the proposed method significantly improves the performance of the Student model, with an average classification accuracy improvement of 2.33 percentage points over baseline methods. These findings strongly indicate the importance and effectiveness of incorporating the spectral graph structure relations between samples into the KD process.
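The abstract describes the method only at a high level. As an illustration of how such an instance-level spectral relation term could be combined with standard soft-label distillation, the following PyTorch sketch builds a k-nearest-neighbor affinity graph over a mini-batch of Teacher features, builds the corresponding graph over the Student features, and penalizes the discrepancy. The function names (knn_affinity, distillation_loss), the Gaussian kernel, the symmetric normalization, the MSE matching term, and all hyperparameters are assumptions made for this sketch, not the paper's exact formulation.

import torch
import torch.nn.functional as F


def knn_affinity(features: torch.Tensor, k: int = 8, sigma: float = 1.0) -> torch.Tensor:
    """Symmetrically normalized affinity over each instance's k nearest neighbors.

    Illustrative choice of kernel and normalization; assumes batch size > k.
    """
    feats = F.normalize(features, dim=1)                    # compare directions, not magnitudes
    dist = torch.cdist(feats, feats)                        # pairwise Euclidean distances
    affinity = torch.exp(-dist.pow(2) / (2 * sigma ** 2))   # Gaussian kernel similarity
    affinity.fill_diagonal_(0)                              # no self-loops

    # Keep only the k strongest edges per row: local structure instead of global distances.
    topk_idx = affinity.topk(k, dim=1).indices
    mask = torch.zeros_like(affinity).scatter_(1, topk_idx, 1.0)
    mask = torch.maximum(mask, mask.t())                    # symmetrize the k-NN graph
    affinity = affinity * mask

    # Symmetric normalization D^{-1/2} W D^{-1/2}, as in spectral graph methods.
    deg = affinity.sum(dim=1).clamp_min(1e-8)
    d_inv_sqrt = deg.pow(-0.5)
    return d_inv_sqrt.unsqueeze(1) * affinity * d_inv_sqrt.unsqueeze(0)


def distillation_loss(student_logits, teacher_logits, student_feats, teacher_feats,
                      labels, T=4.0, alpha=0.5, beta=1.0, k=8):
    """Cross-entropy + soft-label KD + instance spectral relation matching (sketch)."""
    ce = F.cross_entropy(student_logits, labels)
    kd = F.kl_div(F.log_softmax(student_logits / T, dim=1),
                  F.softmax(teacher_logits / T, dim=1),
                  reduction="batchmean") * (T * T)

    # The graph built on the (detached) Teacher features is the target relation.
    g_teacher = knn_affinity(teacher_feats.detach(), k=k)
    g_student = knn_affinity(student_feats, k=k)
    relation = F.mse_loss(g_student, g_teacher)

    return ce + alpha * kd + beta * relation

In a training loop, student_feats and teacher_feats would be penultimate-layer features for the same mini-batch, and beta trades the relation term off against the usual hard-label and soft-label terms.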

Keywords

Knowledge Distillation (KD) / attention transfer / instance spectral relation / spectral graph structure / manifold learning

Classification

Information Technology and Security Science

Cite This Article

张政秀, 周淳, 杨萌. 基于实例谱关系的知识蒸馏[J]. 计算机工程, 2025, 51(11): 63-71, 9.

Funding

Aeronautical Science Foundation of China (2023Z071109001)

Fundamental Research Funds for the Central Universities (2682023ZTPY021, 2682023ZTPY027)
