
A Novel Time-Series Memory Auto-Encoder With Sequentially Updated Reconstructions for Remaining Useful Life Prediction

Authors :
Lin Lin
Shisheng Zhong
Minghang Zhao
Song Fu
Source :
IEEE Transactions on Neural Networks and Learning Systems. 33:7114-7125
Publication Year :
2022
Publisher :
Institute of Electrical and Electronics Engineers (IEEE), 2022.

Abstract

One of the significant tasks in remaining useful life (RUL) prediction is to find a good health indicator (HI) that effectively represents the degradation process of a system. However, traditional data-driven methods struggle to construct accurate HIs because they do not fully account for the temporal dependencies within the monitoring data, especially for aeroengines working under nonstationary operating conditions (OCs). To address this problem, this article develops a novel unsupervised deep neural network, the time-series memory auto-encoder with sequentially updated reconstructions (SUR-TSMAE), to improve the accuracy of the extracted HIs. It directly takes multidimensional time series as input and extracts features along both the feature dimension and the time dimension simultaneously. Furthermore, to make full use of the temporal dependencies, a novel long short-term memory with sequentially updated reconstructions (SUR-LSTM), which uses the errors not only from the current memory cell but also from subsequent memory cells to update the output-layer weights of the current memory cell, is developed to act as the reconstruction layer in the SUR-TSMAE. The SUR-LSTM helps the SUR-TSMAE rapidly reconstruct the input time series with higher precision. Experimental results on a public dataset demonstrate that SUR-TSMAE outperforms several existing methods.
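
The abstract describes an auto-encoder that reconstructs multidimensional sensor sequences with an LSTM-style layer and derives a health indicator from the reconstruction. As a rough illustration of that general idea only (not the authors' SUR-LSTM, whose sequentially updated reconstruction rule is not given in the abstract), the following PyTorch sketch trains a standard LSTM encoder-decoder on multivariate windows and uses the per-window reconstruction error as a degradation proxy; all class and variable names are hypothetical.

# Minimal sketch, assuming a standard LSTM encoder-decoder stands in for the
# paper's SUR-LSTM reconstruction layer. Not the authors' implementation.
import torch
import torch.nn as nn

class TimeSeriesAutoEncoder(nn.Module):
    def __init__(self, n_features: int, hidden_size: int = 32):
        super().__init__()
        self.encoder = nn.LSTM(n_features, hidden_size, batch_first=True)
        self.decoder = nn.LSTM(hidden_size, hidden_size, batch_first=True)
        self.out = nn.Linear(hidden_size, n_features)

    def forward(self, x):                       # x: (batch, time, n_features)
        _, (h, _) = self.encoder(x)             # h: (1, batch, hidden_size)
        seq_len = x.size(1)
        # Repeat the latent code at every time step, then decode it back
        # into the original sensor/feature space.
        z = h[-1].unsqueeze(1).repeat(1, seq_len, 1)
        dec, _ = self.decoder(z)
        return self.out(dec)                    # reconstruction of x

if __name__ == "__main__":
    # Toy usage: a larger reconstruction error suggests a stronger deviation
    # from the healthy baseline, i.e., a higher degradation indicator.
    model = TimeSeriesAutoEncoder(n_features=14)
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    x = torch.randn(8, 30, 14)                  # 8 windows, 30 steps, 14 sensors
    for _ in range(5):
        recon = model(x)
        loss = nn.functional.mse_loss(recon, x)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
    with torch.no_grad():
        hi = (model(x) - x).pow(2).mean(dim=(1, 2))  # per-window HI proxy
    print(hi)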

Details

ISSN :
2162-2388 and 2162-237X
Volume :
33
Database :
OpenAIRE
Journal :
IEEE Transactions on Neural Networks and Learning Systems
Accession number :
edsair.doi.dedup.....8bb3947946d830070eedc1108c51b062