计算机技术与发展 (Computer Technology and Development), 2024, Vol. 34, Issue 4: 101-108. DOI: 10.20165/j.cnki.ISSN1673-629X.2024.0016
Research on Small Object Detection of UAV Based on DTA-FSAF
Abstract
With the increasing use of UAVs, the demand for UAV-based object detection in traffic scenes is also growing. However, existing algorithms suffer from low detection accuracy and insufficient robustness when applied to the UAV perspective. To effectively detect vehicles and pedestrians in traffic scenes from the UAV perspective, we propose the DTA-FSAF object detection network. Firstly, deformable convolution is integrated into the ResNet-50 backbone to improve the feature learning ability of the FSAF (Feature Selective Anchor-Free) network, and a PAFPN (Path Aggregation Feature Pyramid Network) is used for multi-scale feature fusion to improve small-object detection accuracy and the fitting ability of the network. Secondly, task-aligned detection heads are used to reduce the misalignment between the classification and localization tasks when detecting small objects, further improving the robustness of the network. Finally, the IoU loss is adjusted to improve the overall detection performance of the network. Experiments and analysis on the drone dataset VisDrone show that, compared with other networks, DTA-FSAF achieves a detection accuracy of 41.3% in different traffic scenes while meeting real-time requirements, a 19.6% improvement over the FSAF network. The experimental results demonstrate that the improved algorithm can effectively detect pedestrians and vehicles in various complex traffic scenes.
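Two of the changes described above can be illustrated in code. Below is a minimal PyTorch sketch (not the authors' released implementation) of (a) replacing a 3x3 convolution in a ResNet-50 stage with a deformable convolution, using torchvision's DeformConv2d, and (b) a GIoU-style regression loss, shown as one plausible reading of "the IoU loss is adjusted". The module name DeformBlock, the helper giou_loss, and all hyper-parameters are illustrative assumptions, not the paper's exact settings.

import torch
import torch.nn as nn
from torchvision.ops import DeformConv2d

class DeformBlock(nn.Module):
    # Replaces a plain 3x3 conv: a small conv predicts (dx, dy) offsets
    # for each of the 3x3 kernel positions, and DeformConv2d samples the
    # feature map at those learned locations instead of a fixed grid.
    def __init__(self, channels, k=3, padding=1):
        super().__init__()
        self.offset_conv = nn.Conv2d(channels, 2 * k * k, k, padding=padding)
        self.deform_conv = DeformConv2d(channels, channels, k, padding=padding)

    def forward(self, x):
        offset = self.offset_conv(x)   # (N, 18, H, W) for k = 3
        return self.deform_conv(x, offset)

def giou_loss(pred, target, eps=1e-7):
    # pred, target: (N, 4) boxes as (x1, y1, x2, y2).
    lt = torch.max(pred[:, :2], target[:, :2])   # intersection top-left
    rb = torch.min(pred[:, 2:], target[:, 2:])   # intersection bottom-right
    wh = (rb - lt).clamp(min=0)
    inter = wh[:, 0] * wh[:, 1]
    area_p = (pred[:, 2] - pred[:, 0]) * (pred[:, 3] - pred[:, 1])
    area_t = (target[:, 2] - target[:, 0]) * (target[:, 3] - target[:, 1])
    union = area_p + area_t - inter
    iou = inter / (union + eps)
    # Smallest box enclosing both; GIoU penalizes the empty space inside it,
    # which gives a useful gradient even when the boxes do not overlap.
    elt = torch.min(pred[:, :2], target[:, :2])
    erb = torch.max(pred[:, 2:], target[:, 2:])
    enclose = (erb - elt).clamp(min=0).prod(dim=1)
    giou = iou - (enclose - union) / (enclose + eps)
    return (1.0 - giou).mean()

Since DeformBlock keeps the spatial size (e.g. DeformBlock(256)(torch.randn(2, 256, 32, 32)) returns a (2, 256, 32, 32) tensor), it can stand in for the middle conv of a bottleneck without touching the surrounding 1x1 convolutions.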
Keywords: object detection / small object detection / Feature Selective Anchor-Free / UAV / label assignment
Classification: Information Technology and Security Science
Cite this article: 赵侃, 汪慧兰, 郭娇娇, 王桂丽. 基于DTA-FSAF的无人机小目标检测研究 (Research on Small Object Detection of UAV Based on DTA-FSAF) [J]. 计算机技术与发展, 2024, 34(4): 101-108.
Funding:
Natural Science Foundation of Anhui Province (1708085QF133)
Innovation Fund of Anhui Normal University (2018XJJ100)
Anhui Provincial Engineering Laboratory of Information Fusion and Control for Intelligent Robots (IFCIR2020004)