
A Novel Time-Series Memory Auto-Encoder With Sequentially Updated Reconstructions for Remaining Useful Life Prediction.

Authors :
Fu, Song
Zhong, Shisheng
Lin, Lin
Zhao, Minghang
Source :
IEEE Transactions on Neural Networks & Learning Systems. Dec 2022, Vol. 33, Issue 12, p7114-7125. 12p.
Publication Year :
2022

Abstract

One of the significant tasks in remaining useful life (RUL) prediction is to find a good health indicator (HI) that can effectively represent the degradation process of a system. However, it is difficult for traditional data-driven methods to construct accurate HIs because they do not fully account for the temporal dependencies within the monitoring data, especially for aeroengines working under nonstationary operating conditions (OCs). To address this problem, this article develops a novel unsupervised deep neural network, the time-series memory auto-encoder with sequentially updated reconstructions (SUR-TSMAE), to improve the accuracy of extracted HIs; it directly takes the multidimensional time series as input and simultaneously performs feature extraction along both the feature dimension and the time dimension. Further, to make full use of the temporal dependencies, a novel long short-term memory with sequentially updated reconstructions (SUR-LSTM), which uses the errors not only from the current memory cell but also from subsequent memory cells to update the output-layer weights of the current memory cell, is developed to act as the reconstruction layer in the SUR-TSMAE. The use of SUR-LSTM helps the SUR-TSMAE rapidly reconstruct the input time series with higher precision. Experimental results on a public dataset demonstrate the outstanding performance of SUR-TSMAE in comparison with some existing methods. [ABSTRACT FROM AUTHOR]
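To illustrate the general idea of deriving an HI from reconstruction error with an LSTM-based auto-encoder, the sketch below shows a plain LSTM auto-encoder over a multidimensional sensor window. It is an assumption-laden illustration, not the authors' SUR-TSMAE: the sequentially updated reconstruction scheme (SUR-LSTM) described in the abstract is not implemented here, and all layer sizes and window dimensions are placeholders.

# Minimal sketch (assumed PyTorch implementation, not the authors' method):
# an LSTM auto-encoder that reconstructs a multidimensional time series;
# the per-window reconstruction error can serve as a crude health indicator.
import torch
import torch.nn as nn

class LSTMAutoEncoder(nn.Module):
    def __init__(self, n_features: int, hidden_size: int = 32):
        super().__init__()
        self.encoder = nn.LSTM(n_features, hidden_size, batch_first=True)
        self.decoder = nn.LSTM(hidden_size, hidden_size, batch_first=True)
        self.output = nn.Linear(hidden_size, n_features)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, features)
        _, (h, _) = self.encoder(x)                          # summarize the window
        latent = h[-1].unsqueeze(1).repeat(1, x.size(1), 1)  # repeat code over time
        decoded, _ = self.decoder(latent)
        return self.output(decoded)                          # reconstructed sequence

if __name__ == "__main__":
    model = LSTMAutoEncoder(n_features=14)                   # e.g., 14 sensor channels
    window = torch.randn(8, 30, 14)                          # 8 windows, 30 time steps
    recon = model(window)
    hi = torch.mean((recon - window) ** 2, dim=(1, 2))       # one HI value per window
    print(hi.shape)                                          # torch.Size([8])

In this generic setup the HI is simply the mean squared reconstruction error; the paper's contribution, per the abstract, is to sharpen that reconstruction by letting errors from subsequent memory cells update the current cell's output weights.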

Details

Language :
English
ISSN :
2162-237X
Volume :
33
Issue :
12
Database :
Academic Search Index
Journal :
IEEE Transactions on Neural Networks & Learning Systems
Publication Type :
Periodical
Accession number :
160690315
Full Text :
https://doi.org/10.1109/TNNLS.2021.3084249