101. Triple-Stage Attention-Based Multiple Parallel Connection Hybrid Neural Network Model for Conditional Time Series Forecasting
- Author
Yasuhiko Morimoto and Yepeng Cheng
- Subjects
General Computer Science, General Engineering, Deep learning, Feature extraction, Encoder-decoder, Convolutional neural network, Recurrent neural network, Hybrid neural networks, Time series prediction, Triple-stage attention, Artificial intelligence
- Abstract
The attention-based SeriesNet (A-SeriesNet) combines an augmented attention residual learning module-based convolutional neural network (augmented ARLM-CNN) subnetwork with a hidden state attention module-based recurrent neural network (HSAM-RNN) subnetwork for accurate conditional time series prediction. However, the augmented ARLM-CNN subnetwork struggles to extract latent features of multi-condition series: forecasting accuracy decreases as the feature dimension of the multi-condition series grows. The same problem occurs in the HSAM-RNN subnetwork of A-SeriesNet. The dual-stage attention recurrent neural network (DA-RNN) showed that an attention-based encoder-decoder framework deals with this problem effectively. This paper applies the DA-RNN to the HSAM-RNN subnetwork of A-SeriesNet and presents the triple-stage attention-based recurrent neural network (TA-RNN) subnetwork. Furthermore, this paper introduces a CNN-based encoder-decoder structure, the dual attention residual learning module-based convolutional neural network (DARLM-CNN) subnetwork, to improve the augmented ARLM-CNN subnetwork of A-SeriesNet. Finally, this paper presents the triple-stage attention-based SeriesNet (TA-SeriesNet), which connects the proposed subnetworks in parallel by concatenation rather than by the element-wise multiplication used in A-SeriesNet, reducing the dependence of the forecasting result on any single subnetwork. Experimental results show that TA-SeriesNet outperforms other deep learning models on forecasting accuracy evaluation metrics for time series datasets with high feature dimensionality.
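The fusion change described above can be sketched in a few lines. This is a minimal illustration with random stand-in tensors, not the paper's implementation: the subnetwork outputs, their dimensions, and the final linear map are all hypothetical, chosen only to contrast element-wise multiplication (A-SeriesNet-style fusion) with concatenation (TA-SeriesNet-style fusion).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical outputs of the two parallel subnetworks for a batch of
# 4 samples: a CNN-based branch and an RNN-based branch, each emitting
# an 8-dimensional feature vector per sample.
cnn_out = rng.standard_normal((4, 8))
rnn_out = rng.standard_normal((4, 8))

# A-SeriesNet-style fusion: element-wise multiplication.
# A near-zero feature in either branch suppresses the combined feature,
# so the result depends strongly on each individual subnetwork.
fused_mul = cnn_out * rnn_out                            # shape (4, 8)

# TA-SeriesNet-style fusion: concatenation of the branch outputs.
# Each subnetwork's features survive independently, so a weak branch
# cannot zero out the other branch's contribution.
fused_cat = np.concatenate([cnn_out, rnn_out], axis=1)   # shape (4, 16)

# Stand-in for a trained output layer mapping fused features to a forecast.
W = rng.standard_normal((16, 1))
forecast = fused_cat @ W                                 # shape (4, 1)

print(fused_mul.shape, fused_cat.shape, forecast.shape)
```

Note how the concatenated representation doubles the feature dimension and leaves the weighting of the two branches to the downstream layer, which is what reduces the dependence of the forecast on any single subnetwork.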
- Published
- 2021