Journal of University of Chinese Academy of Sciences, 2025, Vol. 42, Issue (4): 565-575, 11. DOI: 10.7523/j.ucas.2023.086
Multispectral remote sensing image pan-sharpening method based on multi-residual network
Abstract
This paper proposes a multispectral remote sensing image pan-sharpening method based on a deep convolutional neural network with residual connections. The method addresses two problems: the spectral distortion of traditional pan-sharpening methods, and the insufficient use of inter-layer information in existing deep-learning-based methods. Deep residual modules built from convolutional layers extract deep spatial and spectral features of the image, and residual connections between sub-blocks propagate gradient information to deeper layers, avoiding gradient explosion and making the network more efficient to train. Experiments are conducted on simulated and real multispectral images from WorldView-2, and the results are compared with traditional methods and existing deep-learning-based methods. The proposed method reduces spectral distortion and learns deeper image features, better preserving the spatial and spectral information of the image. Compared with the deep convolutional pan-sharpening network, it improves ERGAS, SAM, SCC, UIQI, and the global fusion quality evaluation index by 24.4%, 26.7%, 6.2%, 4.7%, and 6.3%, respectively. Subjective and objective evaluations, together with the spectral curves, also indicate that the proposed method significantly improves the spatial and spectral quality of remote sensing images, especially under complex environmental conditions.
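The abstract reports gains on several fusion-quality metrics, including ERGAS and SAM. As a hedged illustration only (this is not the paper's code), the standard definitions of these two metrics can be sketched in NumPy; the function names, the `(H, W, B)` image layout, and the default resolution ratio of 4 are assumptions:

```python
import numpy as np

def sam(reference, fused, eps=1e-12):
    """Mean Spectral Angle Mapper, in degrees, between two (H, W, B) images.

    For each pixel, computes the angle between its spectral vector in the
    reference image and in the fused image, then averages over all pixels.
    Lower is better; 0 means identical spectral directions.
    """
    ref = reference.reshape(-1, reference.shape[-1]).astype(np.float64)
    fus = fused.reshape(-1, fused.shape[-1]).astype(np.float64)
    dot = np.sum(ref * fus, axis=1)
    denom = np.linalg.norm(ref, axis=1) * np.linalg.norm(fus, axis=1) + eps
    # Clip to [-1, 1] to guard against round-off before arccos.
    angles = np.arccos(np.clip(dot / denom, -1.0, 1.0))
    return float(np.degrees(angles.mean()))

def ergas(reference, fused, ratio=4):
    """ERGAS (relative dimensionless global error in synthesis).

    ratio is the spatial resolution ratio between the panchromatic and the
    multispectral image (4 is typical for WorldView-2 products).
    Lower is better; 0 means a perfect fusion.
    """
    ref = reference.astype(np.float64)
    fus = fused.astype(np.float64)
    bands = ref.shape[-1]
    acc = 0.0
    for b in range(bands):
        rmse = np.sqrt(np.mean((ref[..., b] - fus[..., b]) ** 2))
        acc += (rmse / ref[..., b].mean()) ** 2
    return float(100.0 / ratio * np.sqrt(acc / bands))
```

Both metrics are 0 when the fused image equals the reference, which gives a quick sanity check when wiring up an evaluation pipeline.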
Keywords: remote sensing image pan-sharpening method / deep learning / multispectral remote sensing image / convolutional neural network / residual network
Classification: Information Technology and Security Science
Citation: ZHOU Qingze, GUO Qing. Multispectral remote sensing image pan-sharpening method based on multi-residual network[J]. Journal of University of Chinese Academy of Sciences, 2025, 42(4): 565-575, 11.
Funding: Supported by the National Natural Science Foundation of China (61771470)