
基于注意力机制和知识蒸馏的小样本增量学习

Few-Shot Incremental Learning Based on Attention Mechanism and Knowledge Distillation

崔颖 1  徐晓峰 2  包象琳 2  刘传才 1

计算机与数字工程, 2024, Vol. 52, Issue (4): 1093-1097, 1179. DOI: 10.3969/j.issn.1672-9722.2024.04.023

Author Information

  • 1. School of Computer Science and Engineering, Nanjing University of Science and Technology, Nanjing 210094
  • 2. School of Computer and Information, Anhui Polytechnic University, Wuhu 241000

Abstract

Current few-shot learning mainly focuses on performance on the few-shot categories while ignoring performance on the auxiliary set. To address this problem, a few-shot incremental learning model based on an attention mechanism and knowledge distillation is proposed. The attention mechanism is adopted to learn generalization ability on the few-shot data, while knowledge distillation is used to retain discriminating ability on the auxiliary set. The proposed model therefore achieves acceptable classification performance on both the few-shot data and the auxiliary set. Experiments show that the proposed model not only achieves excellent performance on the few-shot data, but also suffers little performance loss on the auxiliary set.
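The abstract does not give implementation details, so the following Python (PyTorch) sketch is an illustration only, not the authors' code: it shows one common way to combine a classification loss on the new few-shot classes with a knowledge-distillation loss that preserves an earlier model's outputs on the auxiliary (base) classes. The names new_model, old_model, num_old_classes, temperature, and alpha are assumptions made for this example.

import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    # Soften both distributions with a temperature and measure their KL divergence.
    log_p_student = F.log_softmax(student_logits / temperature, dim=1)
    p_teacher = F.softmax(teacher_logits / temperature, dim=1)
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * temperature ** 2

def incremental_loss(new_model, old_model, x, y, num_old_classes, alpha=0.5):
    # Cross-entropy over all (old + new) classes learns the few-shot categories;
    # distillation on the old-class logits retains behaviour on the auxiliary set.
    new_logits = new_model(x)
    with torch.no_grad():
        old_logits = old_model(x)  # frozen copy trained on the auxiliary set
    ce = F.cross_entropy(new_logits, y)
    kd = distillation_loss(new_logits[:, :num_old_classes], old_logits)
    return ce + alpha * kd  # alpha trades new-class plasticity against old-class retention

In this hypothetical sketch, alpha balances learning the new classes against retaining auxiliary-set behaviour; the paper's actual loss formulation and attention module may differ.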

Key words

few-shot learning/incremental learning/attention mechanism/knowledge distillation

Classification

Information Technology and Security Science

Cite this article

崔颖,徐晓峰,包象琳,刘传才.基于注意力机制和知识蒸馏的小样本增量学习[J].计算机与数字工程,2024,52(4):1093-1097,1179.

Funding

Supported by the National Natural Science Foundation of China (Nos. 61373063, 61872188, 62172225) and the Natural Science Foundation of Anhui Province (Nos. 2108085QF264, 2108085QF268).
