Computer Engineering and Applications (计算机工程与应用), 2024, Vol. 60, Issue 10: 61-75, 15. DOI: 10.3778/j.issn.1002-8331.2310-0362
针对目标检测模型的物理对抗攻击综述
Survey of Physical Adversarial Attacks Against Object Detection Models
Abstract
Deep learning models are highly susceptible to adversarial samples, and even minuscule image perturbations that are imperceptible to the naked eye can disable well-trained deep learning models. Recent research indicates that these perturbations can also exist in the physical world. This paper provides insight into physical adversarial attacks on deep learning object detection models, clarifying the concept of the physical adversarial attack and outlining the general process of such attacks on object detection. According to the attack task, a series of recent physical adversarial attack methods against object detection networks is reviewed from the perspectives of vehicle detection and pedestrian detection. Other attacks against object detection models, as well as other attack tasks and attack methods, are briefly introduced. Finally, the current challenges of physical adversarial attacks are discussed, the limitations of adversarial training are pointed out, and future development directions and application prospects are suggested.
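To make the notion of an adversarial perturbation concrete, the following is a minimal digital-domain sketch of the fast gradient sign method (FGSM), the basic primitive that the physical attacks surveyed here build on. It assumes PyTorch and torchvision are available; the victim model and eps value are illustrative choices, not taken from the paper.

import torch
import torchvision.models as models

# Load a pretrained classifier as the victim model (illustrative choice).
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT).eval()

def fgsm_perturb(image, label, eps=0.03):
    # Enable gradients with respect to the input pixels.
    image = image.clone().detach().requires_grad_(True)
    loss = torch.nn.functional.cross_entropy(model(image), label)
    loss.backward()
    # Step each pixel by eps in the direction that increases the loss,
    # keeping the perturbation small (L-infinity bounded by eps).
    adv = image + eps * image.grad.sign()
    return adv.clamp(0, 1).detach()

# A random tensor stands in for a real, preprocessed input image.
x = torch.rand(1, 3, 224, 224)
y = torch.tensor([0])
x_adv = fgsm_perturb(x, y)
print((x_adv - x).abs().max())  # max pixel change is bounded by eps

Physical attacks go further than this digital setting: the perturbation must additionally survive printing, viewing angle, distance, and lighting changes, which is the core difficulty the survey examines.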
Keywords: adversarial attack; physical attack; deep learning; deep neural network
Classification: Information Technology and Security Science
Cite this article:
蔡伟, 狄星雨, 蒋昕昊, 王鑫, 高蔚洁. 针对目标检测模型的物理对抗攻击综述[J]. 计算机工程与应用, 2024, 60(10): 61-75, 15.
Funding: National Ministries and Commissions Foundation. ()