计算机科学与探索 (Journal of Frontiers of Computer Science and Technology) 2024, Vol. 18, Issue (3): 805-817. DOI: 10.3778/j.issn.1673-9418.2304045
融合SENet和Transformer的应用层协议识别方法
Application Layer Protocol Recognition Incorporating SENet and Transformer
Abstract
Protocol recognition technology plays a crucial role in network communication and information security. Existing protocol recognition methods based on spatio-temporal features cannot adequately and comprehensively extract protocol features. An application layer protocol recognition method incorporating SENet channel attention and Transformer is proposed. The model focuses on spatio-temporal feature extraction from protocol data and consists of a spatial feature extraction module and a temporal feature extraction module. SE blocks are added to the residual network to capture the associations among multiple channels and adaptively assign weights, so as to extract the key spatial features in different channels. The temporal feature extraction module is built by stacking Transformer encoders based on the multi-head attention mechanism; it comprehensively captures the temporal features of the protocol data by directly leveraging the positional information of the input. After extracting and learning more detailed spatial features and more comprehensive temporal features, a better protocol feature representation is obtained, which improves protocol recognition performance. Experiments on the ISCX2012 and CSE_CIC_IDS2018 hybrid datasets show that the overall recognition accuracy of the proposed model reaches 99.20% and the F1 score reaches 98.99%, both higher than those of the comparison models.
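The channel-reweighting idea behind the SE blocks mentioned in the abstract can be sketched as follows: squeeze each channel to a scalar by global average pooling, pass the result through a small bottleneck (two fully connected layers with a reduction ratio), and scale each channel by the resulting sigmoid weight. This is a minimal NumPy illustration of the general SE mechanism; the weight matrices, reduction ratio, and toy shapes are illustrative assumptions, not the paper's actual parameters.

```python
import numpy as np

def se_block(x, w1, b1, w2, b2):
    """Squeeze-and-Excitation channel reweighting for a feature map x of shape (C, H, W)."""
    # Squeeze: global average pooling per channel -> (C,)
    z = x.mean(axis=(1, 2))
    # Excitation: bottleneck FC -> ReLU -> FC -> sigmoid, giving per-channel weights in (0, 1)
    s = np.maximum(z @ w1 + b1, 0.0)
    s = 1.0 / (1.0 + np.exp(-(s @ w2 + b2)))
    # Scale: multiply each channel of x by its learned weight
    return x * s[:, None, None]

# Toy example (hypothetical sizes): 8 channels, reduction ratio r = 4
rng = np.random.default_rng(0)
C, r = 8, 4
x = rng.standard_normal((C, 6, 6))
w1 = rng.standard_normal((C, C // r)) * 0.1
b1 = np.zeros(C // r)
w2 = rng.standard_normal((C // r, C)) * 0.1
b2 = np.zeros(C)
y = se_block(x, w1, b1, w2, b2)
print(y.shape)  # (8, 6, 6)
```

Because the sigmoid weights lie strictly in (0, 1), every channel of the output is a damped copy of the input channel; in the paper's model these weights are learned so that informative channels are suppressed less than uninformative ones.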
Keywords: SENet / residual network / self-attention / Transformer / protocol recognition / network security
Classification: Information Technology and Security Science
Citation: 陈乾, 洪征, 司健鹏. 融合SENet和Transformer的应用层协议识别方法[J]. 计算机科学与探索, 2024, 18(3): 805-817.
Funding
This work was supported by the National Key Research and Development Program of China (2019YFB2101704).