

Infrared and Visible Image Fusion Based on Fast Joint Bilateral Filtering and Improved PCNN


To address detail loss, inconspicuous targets, and low contrast in infrared and visible image fusion, this paper proposes a fusion method that combines a fast joint bilateral filter (FJBF) with an improved pulse-coupled neural network (PCNN), improving running efficiency while preserving the quality of the fused image. First, the source images are decomposed with the fast joint bilateral filter. Second, to better extract salient structures and target information, the base layers are fused with a weighted-average rule based on a visual saliency map (VSM), while the detail layers are fused with an improved PCNN model in which all parameters are adapted to the input band. Finally, the fused image is reconstructed by superimposing the fused base layer and the fused detail layer. Experimental results show that the method improves fusion quality and effectively preserves important information such as targets, background details, and edges.
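As a rough illustration of the pipeline the abstract describes, here is a minimal Python sketch. It assumes OpenCV-contrib's cv2.ximgproc.jointBilateralFilter as a stand-in for the paper's fast joint bilateral filter; the histogram-based saliency map and the 0.5 + 0.5(S1 - S2) weighting follow one common VSM formulation rather than necessarily the paper's exact rule, and the simplified PCNN below uses fixed parameters where the paper adapts them to the input band. File names are hypothetical.

```python
# Sketch of the FJBF + VSM + PCNN fusion pipeline (illustrative, not the
# authors' exact method). Requires opencv-contrib-python and numpy.
import cv2
import numpy as np

def decompose(img, d=9, sigma_color=25, sigma_space=7):
    """Split an image into a base layer (filter output) and a detail layer."""
    base = cv2.ximgproc.jointBilateralFilter(img, img, d, sigma_color, sigma_space)
    return base, img.astype(np.float32) - base.astype(np.float32)

def saliency(img):
    """Histogram-based visual saliency map (one common VSM formulation)."""
    hist = np.bincount(img.ravel(), minlength=256).astype(np.float32)
    levels = np.arange(256, dtype=np.float32)
    # Saliency of intensity v is sum_j hist[j] * |v - j|; build it as a LUT.
    sal_lut = np.abs(levels[:, None] - levels[None, :]) @ hist
    sal = sal_lut[img]
    return sal / (sal.max() + 1e-12)

def fuse_base(b1, b2, s1, s2):
    """VSM-driven weighted average of the two base layers."""
    w = 0.5 + 0.5 * (s1 - s2)  # weight leans toward the more salient band
    return w * b1 + (1.0 - w) * b2

def pcnn_fire_counts(stim, iters=100, beta=0.2, a_e=0.3, v_e=20.0):
    """Firing counts of a simplified PCNN driven by |detail| coefficients."""
    stim = stim / (np.abs(stim).max() + 1e-12)
    k = np.array([[0.5, 1, 0.5], [1, 0, 1], [0.5, 1, 0.5]], np.float32)
    y = np.zeros_like(stim)
    e = np.ones_like(stim)
    fires = np.zeros_like(stim)
    for _ in range(iters):
        link = cv2.filter2D(y, -1, k)   # linking input from neighbouring spikes
        u = stim * (1.0 + beta * link)  # internal activity
        y = (u > e).astype(np.float32)  # spike where activity beats threshold
        e = np.exp(-a_e) * e + v_e * y  # threshold decays, jumps after a spike
        fires += y
    return fires

def fuse(ir, vis):
    b1, d1 = decompose(ir)
    b2, d2 = decompose(vis)
    base = fuse_base(b1.astype(np.float32), b2.astype(np.float32),
                     saliency(ir), saliency(vis))
    f1 = pcnn_fire_counts(np.abs(d1))
    f2 = pcnn_fire_counts(np.abs(d2))
    detail = np.where(f1 >= f2, d1, d2)  # keep coefficients that fire more
    return np.clip(base + detail, 0, 255).astype(np.uint8)

ir = cv2.imread("ir.png", cv2.IMREAD_GRAYSCALE)    # hypothetical inputs
vis = cv2.imread("vis.png", cv2.IMREAD_GRAYSCALE)
cv2.imwrite("fused.png", fuse(ir, vis))
```

The base/detail split matters because the two fusion rules target different content: the VSM weighting governs large-scale brightness and salient regions in the base layer, while the PCNN firing counts select the band with the stronger local detail response.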

杨艳春;雷慧云;杨万轩

School of Electronic and Information Engineering, Lanzhou Jiaotong University, Lanzhou 730070, Gansu, China

Computer and Automation


image processing; fast joint bilateral filter; pulse-coupled neural network; infrared and visible image; image fusion

《红外技术》(Infrared Technology), 2024, Issue 8

Pages 892-901 (10 pages)

Supported by the Program for Changjiang Scholars and Innovative Research Team in University (IRT_16R36); the National Natural Science Foundation of China (62067006); the Science and Technology Program of Gansu Province (18JR3RA104); the Industrial Support Program for Higher Education Institutions of Gansu Province (2020C-19); the Science and Technology Program of Lanzhou (2019-4-49); the Natural Science Foundation of Gansu Province (23JRRA847, 21JR7RA300); and the Lanzhou Jiaotong University-Tianjin University Joint Innovation Fund (2021052).
