西安电子科技大学学报(自然科学版) 2025, Vol. 52, Issue 2: 1-12, 12. DOI: 10.19665/j.issn1001-2400.20241201
Effective adversarial optical attacks on deep neural networks
Abstract
With the continuous advancement of adversarial attack algorithms, the security risks faced by deep neural networks are increasingly severe. Optical phenomena occur frequently in real-world scenarios, and robustness against optical adversarial attacks directly reflects the safety of deep neural networks. Nevertheless, current research on optical adversarial attacks commonly encounters challenges such as optical perturbation distortion and optimization instability. To address these problems, this paper proposes a novel optical attack method named AdvFlare, which helps explore the effect of flare perturbations on the safety of deep neural networks. AdvFlare constructs a parameterized flare simulation model that represents multiple attributes of the flare pattern, such as shape and color, with high fidelity. On this basis, the paper tackles adversarial perturbation distortion and convergence difficulties through strategies such as parameter-space constraints, random initialization, and stepwise optimization. Experimental results indicate that AdvFlare induces misclassification in deep neural networks with a significantly higher success rate than existing methods, while also offering superior visual perturbation quality and stability. Furthermore, it is found that adversarial training with AdvFlare markedly enhances the robustness of deep neural networks in both the digital and physical worlds, providing valuable insights for improving model robustness in public transportation contexts.
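The abstract names the main ingredients of AdvFlare: a parameterized flare model (shape, color, etc.), parameter-space constraints, random initialization, and stepwise optimization. Purely as an illustration, the sketch below shows in Python/NumPy how a parameterized flare could be rendered onto an image and tuned by black-box random search against a classifier. It is not the paper's implementation; every name here (render_flare, attack, predict_fn, the seven-element parameter vector) is a hypothetical stand-in rather than AdvFlare's actual interface.

```python
# Minimal sketch (not the paper's AdvFlare implementation): a flare is described
# by a small parameter vector, rendered onto the image, and tuned by random
# search under box constraints on the parameters. All names are hypothetical.
import numpy as np

# Parameter vector: (center_x, center_y, radius, intensity, r, g, b), all assumed.
PARAM_LOW  = np.array([0.0, 0.0, 0.05, 0.2, 0.0, 0.0, 0.0])
PARAM_HIGH = np.array([1.0, 1.0, 0.40, 0.9, 1.0, 1.0, 1.0])

def render_flare(image, p):
    """Blend a soft radial flare spot, parameterized by p, into an HxWx3 image in [0, 1]."""
    h, w, _ = image.shape
    ys, xs = np.mgrid[0:h, 0:w]
    cx, cy = p[0] * w, p[1] * h
    radius, intensity, color = p[2] * max(h, w), p[3], p[4:7]
    dist = np.sqrt((xs - cx) ** 2 + (ys - cy) ** 2)
    mask = intensity * np.exp(-(dist / radius) ** 2)           # Gaussian falloff
    out = image * (1.0 - mask[..., None]) + color * mask[..., None]
    return np.clip(out, 0.0, 1.0)

def attack(image, true_label, predict_fn, steps=300, sigma=0.05, rng=None):
    """Random-search attack: propose clipped perturbations of the flare parameters
    and accept only those that lower the classifier's score for the true class."""
    rng = rng if rng is not None else np.random.default_rng(0)
    p = rng.uniform(PARAM_LOW, PARAM_HIGH)                     # random initialization
    best_score = predict_fn(render_flare(image, p))[true_label]
    for _ in range(steps):
        cand = p + sigma * rng.standard_normal(p.shape) * (PARAM_HIGH - PARAM_LOW)
        cand = np.clip(cand, PARAM_LOW, PARAM_HIGH)            # parameter-space constraint
        score = predict_fn(render_flare(image, cand))[true_label]
        if score < best_score:                                 # stepwise improvement
            p, best_score = cand, score
    return render_flare(image, p), p
```

Given a classifier wrapped as predict_fn(image) returning a vector of class probabilities, attack(image, label, predict_fn) returns the flared image and the final flare parameters. Clipping proposals to [PARAM_LOW, PARAM_HIGH] plays the role of the parameter-space constraint, and accepting only score-reducing proposals is a crude stand-in for the stepwise optimization mentioned in the abstract.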
Keywords
deep neural networks / adversarial attacks / flare effect / model robustness / adversarial training
Classification
Computer and Automation
Cite this article
戚富琪, 高海昌, 李博凌, 邹翔. 针对深度神经网络的高效光学对抗攻击[J]. 西安电子科技大学学报(自然科学版), 2025, 52(2): 1-12, 12.
Funding
National Natural Science Foundation of China (62302371)