计算机与数字工程 (Computer and Digital Engineering) 2024, Vol. 52, Issue 4: 1093-1097, 1179. DOI: 10.3969/j.issn.1672-9722.2024.04.023
Few-Shot Incremental Learning Based on Attention Mechanism and Knowledge Distillation
Abstract
Current few-shot learning mainly focuses on performance on the few-shot categories, while ignoring performance on the auxiliary set. To address this problem, a few-shot incremental learning model is proposed based on an attention mechanism and knowledge distillation. The attention mechanism is adopted to learn generalization ability on the few-shot data, while knowledge distillation is utilized to retain discriminating ability on the auxiliary set. The proposed model therefore achieves acceptable classification performance on both the few-shot data and the auxiliary set. Experiments show that the proposed model not only achieves excellent performance on few-shot data, but also does not suffer much performance loss on the auxiliary set.
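The distillation term that preserves discrimination on the auxiliary (base-class) set is commonly a KL divergence between the temperature-softened outputs of the old (teacher) model and the incrementally trained (student) model. The sketch below shows this standard formulation (Hinton-style, with the usual T² scaling); the exact loss used in the paper may differ, and the function and parameter names here are illustrative.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax; higher T produces a softer distribution.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    # KL(teacher || student) on softened distributions, scaled by T^2
    # so gradient magnitudes stay comparable as T changes.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return temperature ** 2 * kl
```

In an incremental setting this term is added to the ordinary classification loss on the new few-shot classes, so the student matches the teacher on old classes while learning new ones. When student and teacher agree exactly, the loss is zero.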
Keywords
few-shot learning / incremental learning / attention mechanism / knowledge distillation
Classification
Information Technology and Security Science
Citation
崔颖, 徐晓峰, 包象琳, 刘传才. Few-Shot Incremental Learning Based on Attention Mechanism and Knowledge Distillation [J]. 计算机与数字工程 (Computer and Digital Engineering), 2024, 52(4): 1093-1097, 1179.
Funding
Supported by the National Natural Science Foundation of China (Nos. 61373063, 61872188, 62172225) and the Anhui Provincial Natural Science Foundation (Nos. 2108085QF264, 2108085QF268).