光学精密工程 (Optics and Precision Engineering), 2025, Vol. 33, Issue 6: 928-944, 17. DOI: 10.37188/OPE.20253306.0928
Multi-level feature fusion for camera pose regression
Abstract
To improve the accuracy and stability of camera pose estimation in complex scenarios, this paper presents the ResGraphLoc network, which improves camera pose regression accuracy under occlusion, illumination changes, and low texture by combining a residual network with a graph attention mechanism. The network adopts ResNet101 as the feature encoder and strengthens salient feature extraction through an improved residual block. A graph attention layer fuses multi-level feature maps, diffusing and aggregating feature information through multi-head self-attention. Finally, position and orientation features are extracted from the feature embedding by a nonlinear MLP layer to complete end-to-end camera pose regression. On a large-scale outdoor dataset, the pose error of ResGraphLoc is lower than that of existing algorithms: in the LOOP and FULL scenarios, the pose regression results are 7.18 m / 2.48° and 16.96 m / 3.16° respectively, an improvement of more than 25% over the baseline model. In the Neighborhood scenario of the 4Seasons dataset, the outdoor localization error is as low as 1.40 m and 0.76°. On an indoor dataset with missing and repetitive textures, the position and orientation regression results reach 0.08 m and 3.25° respectively. The experimental results verify the high accuracy and stability of ResGraphLoc in complex environments and its ability to cope effectively with occlusion, illumination changes, and low-texture scenes.
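The abstract describes a three-stage pipeline: multi-level encoder features, graph-attention fusion via multi-head self-attention, and an MLP pose head. The sketch below illustrates that flow in NumPy. It is only a structural illustration: the abstract does not give layer sizes or trained weights, so all dimensions, projections, and the use of a unit quaternion for orientation are assumptions, and random matrices stand in for the learned ResNet101 features and attention/MLP weights.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_self_attention(nodes, num_heads=4):
    """Fuse graph nodes (one per feature level) with multi-head self-attention.

    Random projections stand in for learned weights; only the dataflow
    (diffusion via attention weights, aggregation via weighted sums) is real.
    """
    n, d = nodes.shape
    dh = d // num_heads
    out = np.zeros_like(nodes)
    for h in range(num_heads):
        Wq, Wk, Wv = (rng.standard_normal((d, dh)) / np.sqrt(d) for _ in range(3))
        q, k, v = nodes @ Wq, nodes @ Wk, nodes @ Wv
        attn = softmax(q @ k.T / np.sqrt(dh))   # (n, n) information-diffusion weights
        out[:, h * dh:(h + 1) * dh] = attn @ v  # aggregate features across levels
    return out

def pose_head(embedding, hidden=64):
    """Nonlinear MLP mapping the fused embedding to position (3) and rotation (4)."""
    d = embedding.size
    W1 = rng.standard_normal((d, hidden)) / np.sqrt(d)
    W2 = rng.standard_normal((hidden, 7)) / np.sqrt(hidden)
    out = np.maximum(embedding @ W1, 0) @ W2    # one-hidden-layer ReLU MLP
    t, q = out[:3], out[3:]
    return t, q / np.linalg.norm(q)             # unit quaternion for orientation

# Four pooled feature vectors standing in for multi-level ResNet101 feature maps.
levels = rng.standard_normal((4, 32))
fused = multi_head_self_attention(levels)
t, q = pose_head(fused.mean(axis=0))
print(t.shape, q.shape)
```

The graph here is fully connected over the feature levels, so each attention head lets every level attend to every other, which is one simple reading of the "diffusion and aggregation" the abstract describes.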
Keywords: computer vision / camera pose estimation / camera localization / graph attention / multi-level feature fusion
Category: Computer and Automation
Citation:
司钧文, 周自维. 多级图特征融合引导相机位姿回归[J]. 光学精密工程, 2025, 33(6): 928-944, 17.
Funding:
National Natural Science Foundation of China (No. 61575090)
National Natural Science Foundation of China Youth Fund (No. 61803189)
Natural Science Foundation of Liaoning Province (No. 2019-ZD-0031, No. 2020FWDF13)