Single Image Super-Resolution Reconstruction Based on Split-Attention Networks (OA; Peking University Core; CSTPCD)
Existing GAN-based single-image super-resolution reconstruction suffers from unstable training, insufficient feature extraction, and severe loss of texture detail at large scaling factors. To address these problems, a single-image super-resolution reconstruction method based on split-attention networks is proposed. First, the generator is built from split-attention residual modules as its basic residual block, improving the generator's feature-extraction ability. Second, the more robust Charbonnier loss and a focal frequency loss are introduced in place of the mean-squared-error loss, and a regularization loss is added to smooth the training results and keep the output from becoming over-pixelated. Finally, spectral normalization is applied in both the generator and the discriminator to improve the stability of the network. At a 4x magnification factor on the Set5, Set14, BSDS100, and Urban100 test sets, the proposed method improves peak signal-to-noise ratio (PSNR) by 1.419 dB and structural similarity (SSIM) by 0.051 over the average of the compared methods. The experimental data and result images show that the method subjectively produces richer detail and better visual quality, and objectively achieves higher PSNR and SSIM values.
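The loss changes described above can be illustrated with a minimal NumPy sketch, assuming the standard definitions of the Charbonnier loss and the focal frequency loss; the `eps` and `alpha` values and the weight normalization are illustrative defaults, not taken from the paper:

```python
import numpy as np

def charbonnier_loss(pred, target, eps=1e-3):
    # Charbonnier loss: a smooth, robust variant of L1,
    # L = mean(sqrt((pred - target)^2 + eps^2)).
    diff = pred - target
    return np.mean(np.sqrt(diff * diff + eps * eps))

def focal_frequency_loss(pred, target, alpha=1.0):
    # Compare images in the frequency domain; the weight w emphasizes
    # frequencies with large reconstruction error ("hard" frequencies).
    fp = np.fft.fft2(pred)
    ft = np.fft.fft2(target)
    diff = np.abs(fp - ft)
    w = diff ** alpha
    w = w / (w.max() + 1e-12)  # normalize weights into [0, 1]
    return np.mean(w * diff ** 2)
```

Unlike the mean-squared-error loss, the Charbonnier term penalizes large residuals roughly linearly, which is less sensitive to outliers, while the frequency-domain term steers the generator toward the high-frequency components that MSE tends to average away.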
Peng Yanfei (彭晏飞); Liu Lanxi (刘蓝兮); Wang Gang (王刚); Meng Xin (孟欣); Li Yongxin (李泳欣)
School of Electronic and Information Engineering, Liaoning Technical University, Huludao 125105, Liaoning, China; Bohai Shipbuilding Vocational College, Huludao 125105, Liaoning, China
Computers and Automation
Keywords: super resolution; generative adversarial network; spectral normalization; split-attention networks
《液晶与显示》 (Chinese Journal of Liquid Crystals and Displays), 2024, No. 7
Pages 950-960 (11 pages)
Supported by the National Natural Science Foundation of China (No. 61772249) and the Basic Research Project of Colleges and Universities in Liaoning Province (No. LJKZ0358)