
Multimodal cross-decoupling for few-shot learning
(多模态交叉解耦的少样本学习方法)

冀中 ¹, 王思迪 ¹, 于云龙 ²

国防科技大学学报 (Journal of National University of Defense Technology), 2024, Vol. 46, Issue (1): 12-21, 10. DOI: 10.11887/j.cn.202401002

Author information

  • 1. School of Electrical Automation and Information Engineering, Tianjin University, Tianjin 300072, China
  • 2. College of Information Science and Electronic Engineering, Zhejiang University, Hangzhou 310027, China

Abstract

Current multimodal few-shot learning methods overlook the impact of inter-attribute differences on accurately recognizing sample categories. To address this problem, a multimodal cross-decoupling method is proposed that decouples semantic features with different attributes and reconstructs the essential category features of samples, aiming to alleviate the impact of attribute differences on category discrimination. Extensive experiments on MIT-States and C-GQA, two benchmark few-shot datasets with large attribute discrepancy, indicate that the proposed method outperforms existing approaches. This verifies its effectiveness and shows that multimodal cross-decoupling can improve classification performance when recognizing categories from only a few samples.
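The paper's implementation is not reproduced on this page, so the following is only an illustrative sketch of the general idea the abstract describes: project each modality's feature into separate attribute and object (category) subspaces, then cross-pair the components across modalities so that category evidence is shared while attribute variation is isolated. All names, dimensions, and the random projections standing in for learned layers are hypothetical, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(0)

def linear(d_in, d_out):
    # Random projection standing in for a learned linear layer.
    return rng.standard_normal((d_in, d_out)) / np.sqrt(d_in)

# Hypothetical sizes: d = feature dimension, k = subspace dimension.
d, k = 64, 16

# Two decoupling heads per modality: one isolates attribute information,
# the other isolates object (category) information.
W_attr_v, W_obj_v = linear(d, k), linear(d, k)  # visual branch
W_attr_s, W_obj_s = linear(d, k), linear(d, k)  # semantic (text) branch

def decouple(x, W_attr, W_obj):
    # Split one feature vector into attribute and object components.
    return x @ W_attr, x @ W_obj

# A support image feature and the embedding of its textual description.
x_vis = rng.standard_normal(d)
x_sem = rng.standard_normal(d)

a_v, o_v = decouple(x_vis, W_attr_v, W_obj_v)
a_s, o_s = decouple(x_sem, W_attr_s, W_obj_s)

# Cross-reconstruction: pair the attribute part of one modality with the
# object part of the other, so the reconstructed feature mixes modalities.
z_cross_1 = np.concatenate([a_v, o_s])  # visual attributes + semantic object
z_cross_2 = np.concatenate([a_s, o_v])  # semantic attributes + visual object

# A few-shot classifier would compare such reconstructed features against
# class prototypes; here we only check the reconstructed feature shapes.
print(z_cross_1.shape, z_cross_2.shape)
```

In a trained model the projections would be optimized (e.g., with a reconstruction or alignment loss) rather than random; the sketch only shows the data flow of decoupling followed by cross-modal recombination.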


Key words

few-shot learning/multimodal learning/feature decoupling/attribute

Classification

Computer and Automation

Cite this article

冀中, 王思迪, 于云龙. 多模态交叉解耦的少样本学习方法[J]. 国防科技大学学报, 2024, 46(1): 12-21, 10.

Funding

Key Research and Development Program of Zhejiang Province (2021C01119)

National Natural Science Foundation of China (62176178, 62002320, U19B2043)

国防科技大学学报 · OACSTPCD · ISSN 1001-2486
