
Infrared and Visible Image Fusion Based on Three-branch Adversarial Learning and Compensation Attention Mechanism
(OA; Peking University Core / 北大核心; CSTPCD)

Abstract

Existing deep learning image fusion methods rely on convolution to extract features and do not consider the global features of the source images; moreover, the fusion results are prone to texture blurring and low contrast. Therefore, this study proposes an infrared and visible image fusion method based on three-branch adversarial learning and compensated attention. First, the generator network uses dense blocks and a compensated attention mechanism to construct local-global three-branch feature extraction. Second, the compensated attention mechanism is built from channel-feature and spatial-feature variations to extract global information and further capture infrared-target and visible-light detail representations. Third, a focused dual-adversarial discriminator is designed to determine the distributional similarity between the fusion result and the source images. Finally, experiments on the public TNO and RoadScene datasets compare the proposed method with nine representative image fusion methods. The proposed method not only produces fusion results with clearer texture details and better contrast, but also outperforms other state-of-the-art methods on objective metrics.
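The abstract describes a compensated attention mechanism that combines channel-feature and spatial-feature gating to inject global information alongside local convolutional features. The paper's exact formulation is not given here; the following is a minimal NumPy sketch of one plausible reading, where `channel_attention`, `spatial_attention`, and the residual "compensation" path are illustrative simplifications (the actual method uses learned convolutional layers inside a GAN generator).

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def channel_attention(feat):
    # feat: (C, H, W). Global average pooling per channel yields one
    # scalar per channel, squashed to a gate in (0, 1).
    pooled = feat.mean(axis=(1, 2))            # (C,)
    gate = sigmoid(pooled)                     # (C,)
    return feat * gate[:, None, None]

def spatial_attention(feat):
    # Aggregate across channels with mean and max, then gate each
    # spatial location so salient regions (e.g. infrared targets)
    # are emphasized.
    avg_map = feat.mean(axis=0)                # (H, W)
    max_map = feat.max(axis=0)                 # (H, W)
    gate = sigmoid(avg_map + max_map)          # (H, W)
    return feat * gate[None, :, :]

def compensated_attention(feat):
    # Apply channel then spatial gating, and add the input back as a
    # residual "compensation" path so local texture detail extracted
    # by the convolutional branches is not suppressed.
    refined = spatial_attention(channel_attention(feat))
    return refined + feat

rng = np.random.default_rng(0)
x = rng.standard_normal((8, 16, 16))   # toy feature map: 8 channels, 16x16
y = compensated_attention(x)
print(y.shape)                          # (8, 16, 16)
```

The residual addition is what makes the gating "compensating" in this sketch: attention re-weights features without being able to zero out the local branch entirely.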

Di Jing (邸敬); Ren Li (任莉); Liu Jizhao (刘冀钊); Guo Wenqing (郭文庆); Lian Jing (廉敬)

School of Electronic and Information Engineering, Lanzhou Jiaotong University, Lanzhou 730070, Gansu, China; School of Information Science and Engineering, Lanzhou University, Lanzhou 730000, Gansu, China

Computer Science and Automation


infrared-visible image fusion; local-global three-branch; local feature extraction; compensated attention mechanism; adversarial learning; focused dual adversarial discriminator

《红外技术》 (Infrared Technology), 2024, Issue 5

pp. 510-521 (12 pages)

Funding: National Natural Science Foundation of China (62061023); Gansu Provincial Outstanding Youth Fund (21JR7RA345); Gansu Provincial Science and Technology Program (22JR5RA360).
