
Sample Rotation-based Hard Sample-Generating Methods for Deep Metric Learning (OA · 北大核心 · CSTPCD)

Abstract

Existing deep metric learning methods guide efficient model training by constructing hard sample generation schemes; among them, methods based on algebraic computation have the advantages of simplicity and efficiency. However, such methods do not take the overall data distribution into account, so the generated samples are highly random and the model converges slowly. To address this problem, we propose a hard sample generation method based on sample rotation: the positive sample of a triplet is rotated, about the center of its class, onto the reverse extension of the line connecting the anchor and that class center. A new loss function is then given, and a deep metric learning model that generates hard samples by sample rotation (RHS-DML) is constructed, which effectively improves training efficiency. Image retrieval experiments were conducted on the Cars196, CUB200-2011, and Stanford Online Products datasets, comparing the proposed algorithm with the symmetric sample generation method among algebraic approaches. The results show that its retrieval performance is 2.4%, 0.7%, and 1.4% higher than that of the symmetric sample generation method on the three datasets, respectively.
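As an illustration only (not the authors' implementation), the Python/NumPy sketch below shows one way the rotation construction described above could be realized for embedding vectors: the generated positive keeps its original distance to the class center but is placed on the reverse extension of the anchor-to-center line, so it ends up farther from the anchor and thus harder. The function name rotate_positive, the epsilon guard, and the toy embedding values are assumptions made for this sketch.

import numpy as np

def rotate_positive(anchor, positive, class_center, eps=1e-12):
    """Rotate `positive` about `class_center` onto the reverse extension of the
    anchor-to-center line, keeping its distance to the center unchanged."""
    radius = np.linalg.norm(positive - class_center)            # rotation preserves this distance
    direction = class_center - anchor                           # anchor -> center direction
    direction = direction / (np.linalg.norm(direction) + eps)   # unit vector (guarded against zero length)
    return class_center + radius * direction                    # point beyond the center, away from the anchor

# Toy 2-D embeddings (hypothetical values, for illustration only).
anchor = np.array([0.0, 0.0])
positive = np.array([2.0, 1.0])
center = np.array([2.0, 0.0])
print(rotate_positive(anchor, positive, center))                # -> [3. 0.], farther from the anchor than the original positive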

张鸽 (Zhang Ge); 闫京 (Yan Jing); 魏巍 (Wei Wei); 梁吉业 (Liang Jiye)

School of Computer and Information Technology, Shanxi University, Taiyuan 030006, Shanxi, China; Key Laboratory of Computational Intelligence and Chinese Information Processing of Ministry of Education, Shanxi University, Taiyuan 030006, Shanxi, China

Computer Science and Automation

Keywords: deep metric learning; hard sample generation; multi-class N-pair loss; algebraic calculations

《山西大学学报(自然科学版)》 (Journal of Shanxi University (Natural Science Edition)), 2024, No. 5

Pages 973-981 (9 pages)

Supported by the National Natural Science Foundation of China (61976184)

DOI: 10.13451/j.sxu.ns.2023106
