A Sleep Staging Model Based on Self-Attention Mechanism and Bi-Directional LSTM (Open Access)

To address the problem that existing models cannot fully capture the transient, random waveforms in a sample and cannot focus on typical, important waveforms, which degrades staging results, a sleep staging model based on a self-attention mechanism and a bi-directional long short-term memory (LSTM) network is proposed. First, a single-stream time-frequency information learning module is constructed to automatically express low-level representations of PSG signals and to mine the time-invariant information and frequency features of EEG data. Then, an adaptive feature recalibration learning module is designed to calibrate the transient and key waveform features appearing in each 30 s sample, giving such features more attention and assigning them greater weights. Finally, the features are fed into an inter-sample sequence dependency learning module to learn the contextual relationships among sleep samples, making full use of the preceding and following samples to determine the class of the current sample. The results show that the proposed method outperforms other mainstream models, achieving accuracies of 85.5% and 84.3% and MF1 scores of 82.1% and 79.6% on the Sleep-EDF-2013 and Sleep-EDF-2018 public sleep datasets, respectively, providing a technical reference for sleep staging tasks.
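
The three-stage pipeline described in the abstract (a CNN-based time-frequency extractor, an attention-based recalibration stage, and a bidirectional LSTM over consecutive epochs) can be illustrated with a minimal PyTorch sketch. All module names, layer sizes, kernel widths, and the single-channel 30 s / 100 Hz epoch format assumed below are illustrative choices, not the configuration reported in the paper.

```python
# Minimal sketch of the three-module pipeline described above.
# Layer sizes and names are illustrative assumptions, not the paper's setup.
import torch
import torch.nn as nn


class SleepStagingSketch(nn.Module):
    def __init__(self, n_classes=5, feat_dim=128):
        super().__init__()
        # 1) Single-stream time-frequency learning: 1-D CNN over a
        #    single-channel 30 s EEG epoch (e.g. 3000 samples at 100 Hz).
        self.feature_extractor = nn.Sequential(
            nn.Conv1d(1, 64, kernel_size=50, stride=6), nn.ReLU(),
            nn.MaxPool1d(8),
            nn.Conv1d(64, feat_dim, kernel_size=8, stride=1), nn.ReLU(),
            nn.AdaptiveAvgPool1d(32),        # -> (batch*epochs, feat_dim, 32)
        )
        # 2) Adaptive feature recalibration: self-attention over the
        #    intra-epoch feature sequence to up-weight salient waveforms.
        self.recalibration = nn.MultiheadAttention(
            embed_dim=feat_dim, num_heads=4, batch_first=True)
        # 3) Inter-sample sequence dependency: bidirectional LSTM over
        #    consecutive 30 s epochs of one recording.
        self.bilstm = nn.LSTM(feat_dim, feat_dim, bidirectional=True,
                              batch_first=True)
        self.classifier = nn.Linear(2 * feat_dim, n_classes)

    def forward(self, x):
        # x: (batch, n_epochs, 1, 3000) -- a sequence of 30 s epochs
        b, t, c, l = x.shape
        z = self.feature_extractor(x.view(b * t, c, l))   # (b*t, d, 32)
        z = z.transpose(1, 2)                             # (b*t, 32, d)
        z, _ = self.recalibration(z, z, z)                # self-attention
        z = z.mean(dim=1).view(b, t, -1)                  # epoch embeddings
        h, _ = self.bilstm(z)                             # (b, t, 2d)
        return self.classifier(h)                         # per-epoch logits


# Usage: stage 20 consecutive epochs for 2 recordings in one pass.
logits = SleepStagingSketch()(torch.randn(2, 20, 1, 3000))  # (2, 20, 5)
```

Because the BiLSTM runs over the whole epoch sequence, each epoch's prediction can draw on its neighbours, which is the "context between adjacent samples" the abstract refers to.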

曹科研;王莹莹;陶杭波

School of Computer Science and Engineering, Shenyang Jianzhu University, Shenyang 110168, Liaoning, China

Computer and Automation

sleep staging; Transformer; bi-directional LSTM; electroencephalogram; single channel

《软件导刊》 2024 (005)

pp. 24-32 (9 pages)

DOI: 10.11907/rjdk.232271
