
基于自注意力机制的时间序列插补

徐磊 曾艳 袁俊峰 岳鲁鹏 殷昱煜 张纪林 薛梅婷 韩猛

Computer Engineering (计算机工程), 2025, Vol. 51, Issue 11: 90-99, 10. DOI: 10.19678/j.issn.1000-3428.0069678


Time Series Imputation Based on Self-Attention Mechanism


Author Information

  • 1. School of Computer Science, Hangzhou Dianzi University, Hangzhou 310000, Zhejiang, China


Abstract

As core data for maritime traffic, ship trajectory data can be used for trajectory prediction, early warning, and other tasks with pronounced temporal characteristics. However, owing to factors such as harsh marine environments and poor communication reliability, missing ship trajectory data is a common problem, and learning from time series containing missing values can significantly degrade the accuracy of time series analysis. The current mainstream solution is to approximately impute the missing data, mainly with convolution-based models that reshape the time series along the timeline to capture its local features; however, their ability to capture the global features of long time series is limited. The Transformer strengthens a model's ability to capture global features by modeling the relationships between time points through its core self-attention mechanism. However, because its attention is computed through matrix multiplication, it ignores the temporal nature of the series, and the resulting global feature weights carry no time-span dependency. Therefore, to address the problem of capturing global features in long time series, this study proposes GANet, a variant network based on the self-attention mechanism. GANet first obtains a basic global feature weight matrix over the time points through self-attention, and then uses gated recurrent units to forget and update this weight matrix along the timeline, yielding a global feature weight matrix with time-span dependency, which is then used for data reconstruction to impute the missing values. By combining the self-attention mechanism with a gating mechanism, GANet captures global features while accounting for the effect of time span on different time points, making the captured global features time-span dependent. Experimental results show that, compared with existing models such as Autoformer and Informer, GANet achieves better imputation performance on the Trajectory, ETT, and Electricity datasets.
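The pipeline the abstract describes (self-attention producing a basic global weight matrix, a GRU forgetting/updating that matrix along the timeline, then attention-weighted reconstruction of missing values) can be illustrated with a minimal NumPy sketch. This is a hypothetical illustration, not the authors' implementation: all function names, the zero-filling of missing entries before attention, the row-wise renormalization, and the random (untrained) GRU weights are assumptions made here for clarity.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def self_attention_weights(X):
    """Basic global feature weight matrix A (T x T) via scaled dot-product self-attention."""
    d = X.shape[1]
    scores = X @ X.T / np.sqrt(d)
    return softmax(scores, axis=-1)

class GRUCell:
    """Minimal GRU cell over length-T weight rows (randomly initialized, untrained)."""
    def __init__(self, dim, seed=0):
        rng = np.random.default_rng(seed)
        init = lambda: rng.normal(0.0, 0.1, (dim, dim))
        self.Wz, self.Uz = init(), init()
        self.Wr, self.Ur = init(), init()
        self.Wh, self.Uh = init(), init()

    def step(self, x, h):
        z = sigmoid(x @ self.Wz + h @ self.Uz)             # update gate
        r = sigmoid(x @ self.Wr + h @ self.Ur)             # reset (forget) gate
        h_cand = np.tanh(x @ self.Wh + (r * h) @ self.Uh)  # candidate state
        return (1.0 - z) * h + z * h_cand

def ganet_impute(X, mask):
    """Impute missing entries of X (T x d); mask is True where a value is observed."""
    X_filled = np.where(mask, X, 0.0)         # zero-fill missing entries before attention
    A = self_attention_weights(X_filled)      # basic weights: no time-span dependency
    T = X.shape[0]
    cell = GRUCell(T)
    h = np.zeros(T)
    A_ts = np.empty_like(A)
    for t in range(T):                        # forget/update weight rows along the timeline
        h = cell.step(A[t], h)
        A_ts[t] = softmax(h)                  # renormalized, time-span-dependent weights
    X_hat = A_ts @ X_filled                   # attention-weighted reconstruction
    return np.where(mask, X, X_hat)           # keep observed values, impute the rest
```

The key contrast the abstract draws is visible in the loop: each row of the plain attention matrix `A` is produced independently by matrix multiplication, whereas each row of `A_ts` also depends, through the recurrent state `h`, on all earlier rows, which is what gives the weights their time-span dependency.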


Keywords

self-attention mechanism/gated recurrent unit/global feature capture/time span dependency/time series imputation

Classification

Computer and Automation

Cite This Article

徐磊, 曾艳, 袁俊峰, 岳鲁鹏, 殷昱煜, 张纪林, 薛梅婷, 韩猛. 基于自注意力机制的时间序列插补[J]. 计算机工程, 2025, 51(11): 90-99, 10.

Funding

National Natural Science Foundation of China (62072146)

Zhejiang Provincial "High-Level Talents Special Support Program" Science and Technology Innovation Leading Talent Project (2022R52043)

Zhejiang Provincial Key Research and Development Program (2023C03194, 2021C03187)

Computer Engineering (计算机工程) · Open Access · Peking University Core Journal (北大核心) · ISSN 1000-3428
