Unsupervised image stitching network based on dimension-aware attention
To address the structural deformation and misalignment commonly seen in unsupervised image stitching, a dimension-aware image stitching network (DAISNet) based on dimension-aware attention is proposed. The network consists of two subnetworks: homography estimation and reconstruction. The reconstruction subnetwork is further composed of a low-resolution optimization branch and a high-resolution dual-path branch. An atrous spatial pyramid pooling (ASPP) module and a dimension-aware attention module are introduced to build the low-resolution optimization branch, enhancing the perception of key regions such as structural features and stitching boundaries. Drawing on the idea of heterogeneous architectures, the high-resolution dual-path branch is built by adding a lower-level subnetwork, which extracts more complementary structural information and improves local detail in the stitched images. Experimental results show that, compared with state-of-the-art image stitching methods such as UDIS, the proposed DAISNet effectively reduces structural deformation and misalignment in stitched images on the UDIS-D dataset, improving structural similarity (SSIM) by more than 0.63% and peak signal-to-noise ratio (PSNR) by more than 0.30%.
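The abstract names a dimension-aware attention module but does not specify its internals. As a rough, purely illustrative sketch (the function name, the average-pooling choice, and the softmax normalization are all assumptions, not the paper's actual design), one simple way to reweight a feature map along each of its dimensions is:

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - x.max())
    return e / e.sum()

def dimension_aware_attention(feat):
    """Reweight a (C, H, W) feature map along each dimension.

    Hypothetical sketch: weights for each dimension are obtained by
    average-pooling over the other two dimensions, normalizing with a
    softmax, and broadcasting the weights back onto the feature map.
    """
    w_c = softmax(feat.mean(axis=(1, 2)))   # channel weights, shape (C,)
    w_h = softmax(feat.mean(axis=(0, 2)))   # height weights, shape (H,)
    w_w = softmax(feat.mean(axis=(0, 1)))   # width weights, shape (W,)
    # Broadcast the three weight vectors over the feature map.
    return feat * w_c[:, None, None] * w_h[None, :, None] * w_w[None, None, :]
```

In a real network the pooled statistics would typically pass through small learned layers before normalization; this sketch only shows the per-dimension pooling-and-rescaling idea.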
潘杨;王白阳;朱磊;王慧栋;李雪
School of Electronics and Information, Xi'an Polytechnic University, Xi'an 710048, Shaanxi, China
Computer and Automation
image stitching; homography estimation; dimension-aware attention; low-resolution optimization branch; high-resolution dual-path branch
Journal of Xi'an Polytechnic University, 2025 (2)
pp. 93-101 (9 pages)
National Natural Science Foundation of China (61971339)