Computer Engineering, 2025, Vol. 51, Issue (2): 387-396, 10. DOI: 10.19678/j.issn.1000-3428.0069116
Nucleus Segmentation Based on Multiscale Attention and Data Augmentation
Abstract
The U-Net has been widely used in medical image segmentation because of its simplicity and efficiency. However, the skip connections in the U-Net do not adequately bridge the semantic gap between the encoder and decoder. In addition, the strict labeling requirements for medical segmentation data limit the number and scale of available datasets. To address these issues, a Multi-Scale Attention Fusion (MSAF) module is designed to effectively alleviate semantic bias by exploiting the attention mechanism's ability to adjust the learning direction of the network and by incorporating multi-scale feature fusion. In the first two stages of the U-Net, MSAF employs channel attention to capture global features; in the next two stages, it uses spatial attention to capture local features. Finally, the features extracted from the multiple stages are fused to enhance the feature information. Moreover, Fourier Transform Data Augmentation (FTDA), a data augmentation method based on the Fourier transform, is introduced to overcome the scarcity of medical segmentation data. FTDA enhances the phase data by disturbing the amplitude data of the input image in the frequency domain. Experimental results on the MoNuSeg, CryoNuSeg, and 2018 Data Science Bowl datasets show that the proposed method outperforms other advanced methods on the mean Intersection over Union (mIoU) and Dice metrics. Furthermore, the proposed FTDA method delivers remarkable performance gains even on small-scale datasets.
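The abstract describes FTDA only at a high level: the input image is transformed to the frequency domain, its amplitude spectrum is disturbed, and its phase spectrum is kept, since phase carries the structural information needed for segmentation. The following NumPy sketch illustrates that idea; it is not the authors' implementation, and the function name `ftda_augment` and the noise strength `alpha` are hypothetical.

```python
import numpy as np

def ftda_augment(image, alpha=0.3, rng=None):
    """Illustrative FTDA-style augmentation (hypothetical sketch).

    Perturbs the amplitude spectrum of `image` with multiplicative
    noise while leaving the phase spectrum untouched, so the
    structural (phase) content is preserved.
    """
    if rng is None:
        rng = np.random.default_rng()
    # Forward 2-D FFT over the spatial axes.
    spectrum = np.fft.fft2(image, axes=(0, 1))
    amplitude = np.abs(spectrum)
    phase = np.angle(spectrum)
    # Disturb only the amplitude: scale each frequency bin by a
    # random factor in [1 - alpha, 1 + alpha].
    noise = 1.0 + alpha * (rng.random(amplitude.shape) * 2.0 - 1.0)
    perturbed = amplitude * noise
    # Recombine with the original phase and invert the transform.
    augmented = np.fft.ifft2(perturbed * np.exp(1j * phase), axes=(0, 1))
    return np.real(augmented)
```

Because only the amplitude is perturbed, nucleus boundaries (encoded largely in the phase) stay aligned with the original mask, so the existing ground-truth annotation can be reused for the augmented image.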
Keywords
attention mechanism / U-Net model / Fourier transform / nucleus segmentation / data augmentation
Classification
Information Technology and Security Science
Citation
ZHANG Xingpeng, HE Dong, YANG Mo, YE Hangbin. Nucleus Segmentation Based on Multiscale Attention and Data Augmentation[J]. Computer Engineering, 2025, 51(2): 387-396, 10.
Funding
Southwest Petroleum University Natural Science "Qihang Plan" Project (2022QHZ023, 2022QHZ013)
Sichuan Science and Technology Innovation Talent Fund (2022JDRC0009)
Sichuan Natural Science Foundation (2022NSFSC0283)
Key R&D Project of the Sichuan Provincial Department of Science and Technology (2023YFG0129)