Journal of Measurement Science and Instrumentation, 2025, Vol. 16, Issue (1): 66-74, 9. DOI: 10.62756/jmsi.1674-8042.2025007
MSFResNet: A ResNeXt50 model based on multi-scale feature fusion for wild mushroom identification
Abstract
To address redundant feature information, insignificant differences in feature representation, and low recognition accuracy on fine-grained images, an MSFResNet network model is proposed that fuses multi-scale feature information on top of the ResNeXt50 model. First, a multi-scale feature extraction module is designed to obtain multi-scale information from feature maps by using convolution kernels of different sizes, while a channel attention mechanism increases the network's acquisition of global information. Second, the feature maps processed by the multi-scale feature extraction module are fused with deep feature maps through short links to guide full learning of the network, thus reducing the loss of texture detail in deep feature maps and improving generalization ability and recognition accuracy. Finally, the validity of the MSFResNet model is verified on public datasets and applied to wild mushroom identification. Experimental results show that, compared with the ResNeXt50 model, MSFResNet improves accuracy by 6.01% on the public FGVC-Aircraft dataset and achieves 99.13% classification accuracy on the wild mushroom dataset, 0.47% higher than ResNeXt50. Furthermore, heat-map results show that MSFResNet significantly reduces interference from background information, making the network focus on the main body of the wild mushroom, which effectively improves the accuracy of wild mushroom identification.
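As a rough, self-contained illustration of the mechanism the abstract describes (not the authors' implementation), the sketch below combines multi-scale feature extraction at several kernel sizes, SE-style channel attention, and a short-link fusion back onto the deep path. Box (mean) filters stand in for learned convolutions, and the attention's fully connected layers use random weights; all function names and parameters here are illustrative assumptions.

```python
import numpy as np

def box_filter(x, k):
    # Naive k×k mean filter with 'same' padding, applied per channel.
    # Stands in for a learned k×k convolution in the multi-scale module.
    p = k // 2
    xp = np.pad(x, ((0, 0), (p, p), (p, p)), mode="edge")
    out = np.empty_like(x)
    _, h, w = x.shape
    for i in range(h):
        for j in range(w):
            out[:, i, j] = xp[:, i:i + k, j:j + k].mean(axis=(1, 2))
    return out

def channel_attention(x, reduction=4, seed=0):
    # SE-style channel attention: squeeze (global average pooling over H, W),
    # excitation (two FC layers; random weights here, purely for illustration),
    # then per-channel rescaling of the feature map.
    c = x.shape[0]
    rng = np.random.default_rng(seed)
    z = x.mean(axis=(1, 2))                                     # squeeze -> (C,)
    w1 = rng.standard_normal((c // reduction, c)) * 0.1
    w2 = rng.standard_normal((c, c // reduction)) * 0.1
    s = 1.0 / (1.0 + np.exp(-(w2 @ np.maximum(w1 @ z, 0.0))))   # sigmoid(W2·ReLU(W1·z))
    return x * s[:, None, None]

def msf_block(x):
    # Multi-scale branches (1×1, 3×3, 5×5 "convolutions") are summed,
    # weighted by channel attention, then fused with the deep path
    # through a short link (residual addition).
    multi = box_filter(x, 1) + box_filter(x, 3) + box_filter(x, 5)
    return x + channel_attention(multi)
```

The short link in `msf_block` is what lets shallow multi-scale texture information reach the deep features directly, which is the fusion idea the abstract credits with reducing the loss of texture detail.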
Keywords
multi-scale feature fusion / attention mechanism / ResNeXt50 / wild mushroom identification / deep learning
Citation
YANG Yang, JU Tao, YANG Wenjie, ZHAO Yuyang. MSFResNet: A ResNeXt50 model based on multi-scale feature fusion for wild mushroom identification [J]. Journal of Measurement Science and Instrumentation, 2025, 16(1): 66-74, 9.
Funding
This work was supported by the National Natural Science Foundation of China (No. 61862037) and the Lanzhou Jiaotong University Tianyou Innovation Team Project (No. TY202002).