
An Attention Mechanism-Based Support Stiffness Prediction for Rocket Turbopumps (OA | PKU Core | CSTPCD)

Abstract

As an important dynamic parameter, stiffness identification and prediction are of key significance for the dynamic characteristics of turbopumps. A prediction model combining an attention mechanism with a bi-directional long short-term memory (BiLSTM) neural network is therefore proposed. Dynamic responses are fused as input, and an LSTM network effectively extracts time-related historical features. Two LSTM layers are then stacked in opposite directions to form the BiLSTM model, which accommodates the complex, lengthy sequences of dynamics information and mines the nonlinear relationships between parameters. An Attention layer is then introduced to assign weights to the extracted features, enhancing the key information. Finally, the identification model is trained on dynamics data from a turbopump. The results show that, for turbopump stiffness characteristics, the Attention-BiLSTM model has a significant advantage in sequence data processing, achieving a mean absolute percentage error (MAPE) of 2.1945%, whereas the single-structure RNN, LSTM, and BiLSTM models yield MAPEs of 10.4977%, 5.4973%, and 2.7986%, respectively. The method thus avoids solving the complex inverse dynamics problem and achieves dynamic identification of the nonlinear parameter.
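The abstract's Attention layer scores each hidden state produced by the recurrent network, normalizes the scores with a softmax, and forms a weighted sum that emphasizes the key time steps; the reported MAPE metric is the average of the absolute percentage errors. A minimal pure-Python sketch of both ideas (the dot-product scoring, the query vector, and the toy dimensions are illustrative assumptions, not the authors' implementation):

```python
import math

def softmax(scores):
    """Numerically stable softmax over a list of scores."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attention_pool(hidden_states, query):
    """Score each hidden state against a query vector (dot product),
    softmax-normalize the scores, and return the weighted sum plus
    the attention weights themselves."""
    scores = [sum(h_d * q_d for h_d, q_d in zip(h, query)) for h in hidden_states]
    weights = softmax(scores)
    dim = len(hidden_states[0])
    pooled = [sum(w * h[d] for w, h in zip(weights, hidden_states))
              for d in range(dim)]
    return pooled, weights

def mape(y_true, y_pred):
    """Mean absolute percentage error, in percent."""
    return 100.0 * sum(abs((t - p) / t) for t, p in zip(y_true, y_pred)) / len(y_true)

# Three hypothetical 2-dimensional hidden states and a query vector.
pooled, weights = attention_pool([[0.2, 0.1], [0.9, 0.4], [0.3, 0.7]], [1.0, 0.0])
```

The weights always sum to one, so the pooled vector stays on the same scale as the hidden states; in the paper's model the query is learned jointly with the BiLSTM rather than fixed as here.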

SU Yue; XU Kaifu; JIN Lu; WANG Wei; HOU Lizhen

School of Power and Energy, Northwestern Polytechnical University, Xi'an 710072, China; Xi'an Aerospace Propulsion Institute, China Aerospace Science and Technology Corporation, Xi'an 710100, China

liquid rocket engine; turbopump; support stiffness; long short-term memory network; attention mechanism

Journal of Nanjing University of Aeronautics & Astronautics, 2024(4)

Pages 639-649 (11 pages)

Supported by the Fundamental Research Funds for the Central Universities (D5000210486).

DOI: 10.16356/j.1005-2615.2024.04.006
