
A benign-malignant tumor classification system based on breast ultrasound video streams and self-supervised contrastive learning (OA; Peking University Core; CSTPCD)

Breast tumor classification based on video stream and self-supervised contrastive learning



Abstract: Breast ultrasound is widely used in the diagnosis of breast tumors. Deep learning-based tumor benign-malignant classification models can effectively assist doctors in diagnosis, improving efficiency and reducing misdiagnosis rates, among other benefits. However, the high cost of annotated data limits the development and application of such models. In this study, we construct an unlabeled pretraining dataset from breast ultrasound videos, which includes 11805 target samples and dynamically generated positive and negative sample datasets (with sample sizes of 188880 and 1310355, respectively). Based on this dataset, we build a triplet network and conduct self-supervised contrastive learning. Additionally, we develop Hard Negative Mining and Hard Positive Mining methods to select challenging positive and negative samples for constructing the contrastive loss function, accelerating model convergence. After parameter transfer, the triplet network is fine-tuned and tested on the SYU dataset. Experimental results demonstrate that the triplet network model developed in this study exhibits stronger generalization capability and better classification performance compared to several state-of-the-art models pretrained on ImageNet and previous multi-view contrastive models for breast ultrasound. Furthermore, we test the minimum requirement of annotated data for the model and find that using only 96 annotated data points achieves a performance with an AUC of 0.901 and sensitivity of 0.835.
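The Hard Positive / Hard Negative Mining strategy described in the abstract can be sketched as follows. This is a minimal illustrative example, not the paper's implementation: the function name, the margin value, and the toy embeddings are all assumptions. The core idea is that, for each anchor, the "hardest" positive is the one farthest from the anchor and the "hardest" negative is the one closest to it; the triplet margin loss is then computed on this mined pair.

```python
import numpy as np

def triplet_loss_with_hard_mining(anchor, positives, negatives, margin=0.2):
    """Triplet margin loss with Hard Positive / Hard Negative Mining.

    anchor:    (d,) embedding of the target sample
    positives: (P, d) embeddings of candidate positive samples
    negatives: (N, d) embeddings of candidate negative samples
    """
    # Euclidean distances from the anchor to every candidate sample
    d_pos = np.linalg.norm(positives - anchor, axis=1)
    d_neg = np.linalg.norm(negatives - anchor, axis=1)

    hard_pos = d_pos.max()  # Hard Positive Mining: farthest positive
    hard_neg = d_neg.min()  # Hard Negative Mining: closest negative

    # Standard triplet margin loss on the mined (hardest) pair
    return max(hard_pos - hard_neg + margin, 0.0)

# Toy 2-D embeddings (illustrative only)
anchor = np.array([0.0, 0.0])
positives = np.array([[0.1, 0.0], [0.3, 0.0]])  # hardest positive: distance 0.3
negatives = np.array([[1.0, 0.0], [0.5, 0.0]])  # hardest negative: distance 0.5
loss = triplet_loss_with_hard_mining(anchor, positives, negatives)
```

Because the mined pair is the most difficult one in the batch, the gradient signal concentrates on samples the network currently confuses, which is what accelerates convergence relative to averaging over all pairs.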

Tang Yunxin; Liao Mei; Zhang Yanling; Zhang Jian; Chen Hao; Wang Wei

School of Physics, Nanjing University, Nanjing 210093; Department of Ultrasound, Third Affiliated Hospital of Sun Yat-sen University, Guangzhou 510630; School of Physics, Nanjing University, Nanjing 210093 || Institute for Brain Sciences, Nanjing University, Nanjing 210093; Hangzhou Jingkang Technology, Hangzhou 310000

Computer Science and Automation


Keywords: breast ultrasound; deep learning; self-supervised learning; contrastive learning; pre-trained model; Triplet Network

Journal of Nanjing University (Natural Science), 2024 (001)

Pages 26-37 (12 pages)

National Natural Science Foundation of China (11774158)

DOI: 10.13232/j.cnki.jnju.2024.01.004
