Computer Engineering and Applications, 2025, 61(15): 111-123. DOI: 10.3778/j.issn.1002-8331.2503-0097
LMFI-YOLO: Lightweight Pedestrian Detection Algorithm in Complex Scenes
Abstract
Aiming at the problems of false detections, missed detections, and high model complexity in current pedestrian detection algorithms for complex scenes, an improved YOLO11-based lightweight pedestrian detection algorithm, LMFI-YOLO, is proposed. RepConv is integrated into the C3k2 module to construct the RS-C3k2 structure, enhancing the ability to learn and capture pedestrian features. A new neck structure, MBFPN, is designed, which combines an efficient upsampling module with a multi-scale convolution module to strengthen feature fusion, enrich the feature representation of pedestrians, and substantially improve detection accuracy. A task-interaction detection head, TD-Detect, is designed, which significantly reduces the number of parameters and the model size through shared convolutions and a task-interaction mechanism. To further improve detection accuracy, Focaler-GIoU is adopted as the bounding-box regression loss function, improving target localization and overall performance by focusing on different regression samples. Experimental results show that the proposed algorithm increases mAP50 by 8.5 percentage points on the CityPersons dataset, reduces the number of model parameters to 1.8×10⁶, and compresses the model size to 4.1 MB. Generalization experiments on the TinyPerson and CrowdHuman datasets show that the proposed algorithm improves mAP50 by 6.0 and 4.0 percentage points for small-size targets and occlusion scenes, respectively. In summary, LMFI-YOLO markedly reduces model complexity while significantly improving detection accuracy.
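The abstract names Focaler-GIoU as the bounding-box regression loss. A minimal sketch of how such a loss could be formed is given below, assuming it follows the published Focaler-IoU idea (a linear interval remapping of IoU, with hypothetical thresholds `d` and `u`) combined with the standard GIoU penalty; this is an illustration, not the paper's exact implementation.

```python
# Hedged sketch: Focaler-GIoU = GIoU loss adjusted by Focaler-IoU's
# linear interval mapping. Thresholds d, u are assumed hyperparameters.

def iou_giou(box_a, box_b):
    """Boxes as (x1, y1, x2, y2); returns (IoU, GIoU)."""
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    inter_w = max(0.0, min(ax2, bx2) - max(ax1, bx1))
    inter_h = max(0.0, min(ay2, by2) - max(ay1, by1))
    inter = inter_w * inter_h
    union = ((ax2 - ax1) * (ay2 - ay1)
             + (bx2 - bx1) * (by2 - by1) - inter)
    iou = inter / union if union > 0 else 0.0
    # Smallest enclosing box C for the GIoU penalty term.
    c_area = ((max(ax2, bx2) - min(ax1, bx1))
              * (max(ay2, by2) - min(ay1, by1)))
    giou = iou - (c_area - union) / c_area if c_area > 0 else iou
    return iou, giou

def focaler_giou_loss(box_a, box_b, d=0.0, u=0.95):
    """L = L_GIoU + IoU - IoU_focaler, where IoU_focaler linearly
    remaps IoU on [d, u] so the loss focuses on chosen samples."""
    iou, giou = iou_giou(box_a, box_b)
    if iou < d:
        iou_f = 0.0
    elif iou <= u:
        iou_f = (iou - d) / (u - d)
    else:
        iou_f = 1.0
    return (1.0 - giou) + iou - iou_f
```

With this formulation, a perfect prediction yields zero loss, while disjoint boxes are penalized by both the IoU and enclosing-box terms.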
Keywords
pedestrian detection / small-target pedestrians / occluded pedestrians / depth-wise convolution / task interaction
Category
Information Technology and Security Science
Citation
Yuan Tingting, Lai Huicheng, Tang Jingwen, Zhang Xi, Gao Guxue. LMFI-YOLO: Lightweight Pedestrian Detection Algorithm in Complex Scenes [J]. Computer Engineering and Applications, 2025, 61(15): 111-123.
Funding
Key Research and Development Program of Xinjiang Uygur Autonomous Region (2022B01008).