Computer Engineering and Applications, 2025, Vol. 61, Issue (6): 295-303, 9. DOI: 10.3778/j.issn.1002-8331.2311-0170
Adaptive Separation of Knowledge Distillation for Remote Sensing Object Detection
Abstract
In recent years, deep models have achieved great success in large-scale applications, but their computational complexity and storage requirements make them difficult to deploy on resource-limited devices. Knowledge distillation (KD) is a method for compressing models; however, existing methods do not account for the characteristics of remote sensing datasets. Specifically, because remote sensing images contain complex backgrounds and small target objects, directly applying existing knowledge distillation methods introduces a large amount of noise, which degrades training performance. Therefore, the adaptive separation of knowledge distillation (ASKD) method is proposed. ASKD allows the student model to automatically select multi-scale core features to reduce noise, and at the same time effectively suppresses background interference by separating global and local features. ASKD achieves excellent performance with both single-stage and two-stage detectors on the LEVIR and SSDD datasets. For example, with a ResNet-18-based Faster R-CNN, ASKD achieves 59.2% mAP on SSDD, which is 2.0 percentage points higher than the baseline model and even better than the teacher model.

Keywords
knowledge distillation / remote sensing / lightweight object detection

Classification
Information Technology and Security Science

Citation
YANG Xiaoyu, GU Jinguang. Adaptive Separation of Knowledge Distillation for Remote Sensing Object Detection[J]. Computer Engineering and Applications, 2025, 61(6): 295-303, 9.

Funding
National Key Research and Development Program of China (2022YFC3300800).
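The abstract's core idea, computing the distillation loss separately on foreground (local, object) and background (global) regions of the feature map so that background clutter can be down-weighted, can be sketched as follows. This is a minimal illustration under assumptions, not the paper's implementation: the function name, the mask construction from ground-truth boxes, and the weights `alpha`/`beta` are all hypothetical.

```python
import numpy as np

def fg_bg_distill_loss(t_feat, s_feat, boxes, alpha=1.0, beta=0.5):
    """Feature distillation loss that separates object (local) regions from
    the background (global) region, so background interference is weighted down.

    t_feat, s_feat: teacher/student feature maps, shape (C, H, W).
    boxes: ground-truth boxes in feature-map coordinates, list of (x1, y1, x2, y2).
    alpha, beta: foreground/background loss weights (assumed values).
    """
    _, h, w = t_feat.shape
    fg_mask = np.zeros((h, w), dtype=bool)
    for x1, y1, x2, y2 in boxes:
        fg_mask[y1:y2, x1:x2] = True  # mark pixels inside object boxes
    bg_mask = ~fg_mask

    sq_err = ((t_feat - s_feat) ** 2).mean(axis=0)  # per-pixel error over channels
    fg_loss = sq_err[fg_mask].mean() if fg_mask.any() else 0.0
    bg_loss = sq_err[bg_mask].mean() if bg_mask.any() else 0.0
    # A smaller beta suppresses the contribution of the cluttered background.
    return alpha * fg_loss + beta * bg_loss
```

Setting `beta=0` makes the loss ignore teacher-student disagreement outside the ground-truth boxes entirely, which illustrates how separating the two regions lets background noise be controlled independently of the object signal.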