Abstract
As multivariate time series data become increasingly prevalent across various industries, anomaly detection methods that can ensure the stable operation and security of systems have become crucial. Owing to the inherent complexity and dynamic nature of multivariate time series data, higher demands are placed on anomaly detection algorithms. To address the inefficiencies of existing anomaly detection methods in processing high-dimensional data with complex inter-variable relations, this study proposes an anomaly detection algorithm for multivariate time series data based on Graph Neural Networks (GNNs) and a diffusion model, named GRD. By leveraging node embedding and graph structure learning, the GRD algorithm effectively captures the relations between variables and refines features through a Gated Recurrent Unit (GRU) and a Denoising Diffusion Probabilistic Model (DDPM), thereby facilitating precise anomaly detection. Traditional assessments often employ a Point-Adjustment (PA) protocol that adjusts predictions before scoring, which substantially overestimates an algorithm's capability. To reflect model performance realistically, this work adopts a new evaluation protocol along with new metrics. The GRD algorithm achieves F1@k scores of 0.7414, 0.8017, and 0.7671 on three public datasets. These results indicate that the GRD algorithm consistently outperforms existing methods, with notable advantages in processing high-dimensional data, thereby underscoring its practicality and robustness in real-world anomaly detection applications.
Key words
multivariate time series data; anomaly detection; Graph Neural Network (GNN); diffusion model; evaluation protocol
Classification
Information Technology and Security Science