Fusion Method for Polarization Direction Images Based on a Double-Branch Antagonism Network
To improve the fusion of polarization direction images, this study presents a double-branch antagonism network (DANet), comprising three main modules: feature extraction, feature fusion, and feature transformation. The feature extraction module consists of a low-frequency branch and a high-frequency branch. The 0°, 45°, 90°, and 135° polarization direction images are concatenated and fed into the low-frequency branch to extract energy features, while the differences of the two antagonistic image pairs (0°/90° and 45°/135°) are fed into the high-frequency branch to extract detail features. The energy and detail features are then fused, and the fused features are finally transformed into the fused image. Experiments show that the fused images produced by DANet improve markedly in both visual quality and objective metrics: compared with the composite intensity image I and the polarization antagonistic images Sd, Sdd, Sh, and Sv, the average gradient, information entropy, spatial frequency, and mean gray value increase by at least 22.16%, 9.23%, 23.44%, and 38.71%, respectively.
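The branch inputs and one of the evaluation metrics described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names are hypothetical, the antagonistic differences are taken as Sh = I0 − I90 and Sd = I45 − I135 (one signed difference per pair), and the average-gradient formula is the common definition, which the paper may parameterize differently.

```python
import numpy as np

def danet_inputs(i0, i45, i90, i135):
    """Build the two DANet branch inputs from four polarization
    direction images (H x W float arrays).

    Low-frequency branch: the four direction images stacked as channels.
    High-frequency branch: the two antagonistic pair differences,
    Sh = I0 - I90 and Sd = I45 - I135.
    """
    low = np.stack([i0, i45, i90, i135], axis=0)    # shape (4, H, W)
    high = np.stack([i0 - i90, i45 - i135], axis=0)  # shape (2, H, W)
    return low, high

def average_gradient(img):
    """Average gradient, one of the evaluation metrics in the abstract:
    mean of sqrt((dx^2 + dy^2) / 2) over the image interior."""
    dx = np.diff(img, axis=1)[:-1, :]  # horizontal first differences
    dy = np.diff(img, axis=0)[:, :-1]  # vertical first differences
    return np.mean(np.sqrt((dx ** 2 + dy ** 2) / 2.0))
```

For example, feeding four constant test images produces a constant Sh difference plane, and a constant image has an average gradient of zero, as expected of a detail-free input.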
FENG Rui; YUAN Hongwu; ZHOU Yuye; WANG Feng
School of Electronic and Information Engineering, Anhui Jianzhu University, Hefei 230601, China || School of Big Data and Artificial Intelligence, Anhui Xinhua University, Hefei 230088, China || Anhui Province Key Laboratory of Polarization Imaging Detection Technology, Hefei 230031, China
Computer and Automation
image fusion; deep learning; polarization image
Infrared Technology (《红外技术》), 2024, No. 3
Pages 288-294 (7 pages)
Supported by the National Natural Science Foundation of China (61906118); the Natural Science Foundation of Anhui Province (2108085MF230); and the Open Fund of the Anhui Province Key Laboratory of Polarization Imaging Detection Technology (KFJJ-2020-2)