
Indexed in: OA, Peking University Core (北大核心), CSTPCD

Cross-Entropy Ensemble Image Classification Method Based on Multi-Representation Learning

Abstract


Cross-entropy is a common loss function in classification tasks. However, existing deep classification methods often rely on a single-model cross-entropy design, which leads to low generalization ability and poor robustness. Inspired by multi-view representation learning, this study proposes a deep ensemble cross-entropy loss to improve the generalization ability and robustness of deep networks. The proposed method constructs diversified sub-networks to learn different representations of the same image data from multiple deep views, and then classifies these multi-view representations with an ensemble cross-entropy design. In this way, the method fully exploits the diversified representations of multi-view deep networks for robust image classification: the cross-entropy losses from multiple views are unified into an overall ensemble space for classification, improving on the image classification performance of the traditional single-model cross-entropy design. Experimental results on image datasets such as SVHN and CIFAR show that the proposed method achieves a clear improvement in recognition accuracy over existing image classification methods such as MEAL and CEN.
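The full method is not reproduced in this record, but the abstract's central idea — unifying the cross-entropy losses of several sub-networks ("views") of the same image into one ensemble loss — can be sketched in plain Python. The function names, the toy logits, and the simple averaging rule below are illustrative assumptions, not the authors' exact formulation:

```python
import math

def softmax(logits):
    """Numerically stable softmax over a list of raw class scores."""
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def cross_entropy(logits, label):
    """Cross-entropy loss of one sub-network's prediction for one sample."""
    return -math.log(softmax(logits)[label])

def ensemble_cross_entropy(per_view_logits, label):
    """Average the per-view cross-entropy losses into a single ensemble loss,
    one simple way to unify multiple views in a shared loss space."""
    losses = [cross_entropy(logits, label) for logits in per_view_logits]
    return sum(losses) / len(losses)

# Three hypothetical sub-networks scoring the same image over 3 classes:
views = [
    [2.0, 0.3, -0.5],   # view 1: confident in class 0
    [1.1, 0.9,  0.0],   # view 2: less certain
    [1.8, -0.2, 0.4],   # view 3
]
loss = ensemble_cross_entropy(views, label=0)
```

Because the ensemble loss is an average, it always lies between the best and worst per-view losses, so one weak sub-network cannot dominate training the way it could under a single-model design.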

QU Kun; WANG Zhenlong; LIU Zhifeng

Jingjiang College, Jiangsu University, Zhenjiang 212013, Jiangsu, China; School of Computer Science and Communication Engineering, Jiangsu University, Zhenjiang 212013, Jiangsu, China

Computer and Automation


deep network; image classification; cross-entropy loss; multi-representation learning; ensemble learning

Computer Engineering (《计算机工程》), 2024, Issue 10

Pages 322-333 (12 pages)

Supported by the Yangzhou Industry Foresight and Key Common Technology Project (SCY2023000087).

DOI: 10.19678/j.issn.1000-3428.0068519
