
Novel hybrid multi-head self-attention and multifractal algorithm for non-stationary time series prediction.

Authors :
Yu, Xiang
Zhang, Dongmei
Zhu, Tianqing
Jiang, Xinwei
Source :
Information Sciences. Oct2022, Vol. 613, p541-555. 15p.
Publication Year :
2022

Abstract

Traditional methods have shown outstanding capabilities in time series prediction. However, due to essential differences in volatility characteristics among diverse types of non-stationary multivariate time series (NSMTS), it is difficult for traditional methods to maintain robust prediction performance. This study proposes a novel dynamic recurrent neural network to achieve stable and robust prediction performance. First, a multifractal gated recurrent unit (MF-GRU) based on the multifractal method is proposed to extract volatility characteristics. Meanwhile, to strengthen the parameters of the historical hidden-layer states that have a more significant impact on the output, a self-attention mechanism is introduced into the MF-GRU, leading to a multifractal gated recurrent unit multi-head self-attention model. The efficiency of the proposed model was verified on public datasets. The experimental results show that the proposed model outperforms traditional methods such as long short-term memory (LSTM), the gated recurrent unit (GRU), and the minimal gated unit (MGU). [ABSTRACT FROM AUTHOR]
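The abstract describes the core architectural idea: a recurrent encoder whose historical hidden states are re-weighted by multi-head self-attention before the prediction is made. The sketch below is a minimal illustration of that general pattern, not the authors' implementation; the multifractal volatility extraction of the MF-GRU is paper-specific and is not reproduced, and the class name, hidden size, and head count are assumptions for illustration only.

```python
# Minimal sketch: GRU hidden states re-weighted by multi-head self-attention.
# NOT the paper's MF-GRU; the multifractal component is omitted and all
# names/hyperparameters below are illustrative assumptions.
import torch
import torch.nn as nn


class GRUSelfAttention(nn.Module):
    def __init__(self, input_size: int, hidden_size: int = 64,
                 num_heads: int = 4, output_size: int = 1):
        super().__init__()
        # Recurrent encoder over the multivariate time series.
        self.gru = nn.GRU(input_size, hidden_size, batch_first=True)
        # Multi-head self-attention over the sequence of hidden states,
        # strengthening the historical states most relevant to the output.
        self.attn = nn.MultiheadAttention(hidden_size, num_heads,
                                          batch_first=True)
        self.head = nn.Linear(hidden_size, output_size)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, input_size)
        states, _ = self.gru(x)                       # (batch, seq_len, hidden)
        attended, _ = self.attn(states, states, states)
        # Predict the next value from the attended final state.
        return self.head(attended[:, -1, :])


if __name__ == "__main__":
    model = GRUSelfAttention(input_size=3)
    x = torch.randn(8, 30, 3)   # 8 series, 30 past steps, 3 variables
    print(model(x).shape)       # torch.Size([8, 1])
```

In this sketch the attention layer attends over all past hidden states, which mirrors the abstract's motivation of emphasizing the historical states with the greatest influence on the prediction; how the multifractal features gate the recurrence is detailed only in the full paper.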

Details

Language :
English
ISSN :
00200255
Volume :
613
Database :
Academic Search Index
Journal :
Information Sciences
Publication Type :
Periodical
Accession number :
159928210
Full Text :
https://doi.org/10.1016/j.ins.2022.08.126