
Spatially Adaptive and Content-Aware Infrared Small Target Detection
(Open Access; Peking University Core Journal; CSTPCD-indexed)

Abstract

Small targets in infrared street images occupy few pixels and carry limited color information, which makes models prone to missed detections, false detections, and generally poor detection performance. To address these problems, a spatially adaptive and content-aware infrared small target detection algorithm is proposed. First, a spatially adaptive transformer is built by stacking local attention and deformable attention, strengthening the modeling of long-range dependencies and capturing more spatial position information. Second, the content-aware reassembly of features (CARAFE) operator is used for feature upsampling, aggregating contextual information within a large receptive field and adaptively reassembling features with shallow-layer information. Finally, a 160×160 high-resolution prediction head is added to map the pixels of the input features onto finer detection regions, further improving small target detection. Experiments on the FLIR dataset show that the improved algorithm reaches a mean average precision (mAP) of 85.6%, an improvement of 3.9% over the YOLOX-s algorithm, validating the superiority of the proposed algorithm for infrared small target detection.
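The abstract summarizes the CARAFE-based upsampling step without implementation detail. As an illustration only, the following is a minimal PyTorch sketch of a content-aware reassembly upsampler, assuming the standard two-stage CARAFE formulation (kernel prediction, then weighted reassembly of k×k input neighborhoods); the class name and hyper-parameters (c_mid, k_encoder, k_up, scale) are assumptions for illustration, not values taken from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class CARAFEUpsample(nn.Module):
    """Sketch of a CARAFE-style content-aware upsampler (assumed hyper-parameters)."""

    def __init__(self, channels, scale=2, c_mid=64, k_encoder=3, k_up=5):
        super().__init__()
        self.scale, self.k_up = scale, k_up
        # Kernel prediction: compress channels, then predict one k_up*k_up
        # reassembly kernel for every output (upsampled) pixel.
        self.compress = nn.Conv2d(channels, c_mid, kernel_size=1)
        self.encode = nn.Conv2d(c_mid, scale * scale * k_up * k_up,
                                kernel_size=k_encoder, padding=k_encoder // 2)
        self.shuffle = nn.PixelShuffle(scale)  # -> (B, k_up*k_up, sH, sW)

    def forward(self, x):
        b, c, h, w = x.shape
        s, k = self.scale, self.k_up
        # 1) Content-aware kernels, softmax-normalized over each k*k window.
        kernels = F.softmax(self.shuffle(self.encode(self.compress(x))), dim=1)
        # 2) Gather the k*k neighbourhood of every low-resolution pixel.
        patches = F.unfold(x, kernel_size=k, padding=k // 2)          # B, C*k*k, H*W
        patches = patches.view(b, c * k * k, h, w)
        # Each output pixel reuses the neighbourhood of its nearest source pixel.
        patches = F.interpolate(patches, scale_factor=s, mode='nearest')
        patches = patches.view(b, c, k * k, s * h, s * w)
        # 3) Reassemble: weighted sum over the neighbourhood per output pixel.
        return (patches * kernels.unsqueeze(1)).sum(dim=2)            # B, C, sH, sW


# Example: upsampling a 40x40 feature map to 80x80.
y = CARAFEUpsample(256)(torch.randn(1, 256, 40, 40))
print(y.shape)  # torch.Size([1, 256, 80, 80])
```

In a YOLOX-style neck, such a module would take the place of plain nearest-neighbour upsampling; with a 640×640 input (the usual YOLOX-s resolution, assumed here), adding one extra stride-4 detection scale is what yields the 160×160 prediction head mentioned in the abstract.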

闵锋;刘彪;况永刚;毛一新;刘煜晖

Hubei Key Laboratory of Intelligent Robot, School of Computer Science and Engineering, Wuhan Institute of Technology, Wuhan 430205, Hubei, China

Computer Science and Automation

spatially adaptive; content-aware; infrared target; feature reassembly; high-resolution prediction head

Infrared Technology (《红外技术》), 2024, Issue 7

Pages 735-742 (8 pages)

Supported by the National Natural Science Foundation of China (62171328).
