
Self-attention-based time-variant neural networks for multi-step time series forecasting.

Authors :
Gao, Changxia
Zhang, Ning
Li, Youru
Bian, Feng
Wan, Huaiyu
Source :
Neural Computing & Applications. Jun 2022, Vol. 34, Issue 11, p8737-8754. 18p.
Publication Year :
2022

Abstract

Time series forecasting is ubiquitous in various scientific and industrial domains. Powered by recurrent, convolutional, and self-attention mechanisms, deep learning exhibits high efficacy in time series forecasting. However, existing forecasting methods suffer from several limitations. For example, recurrent neural networks are limited by the vanishing gradient problem, convolutional neural networks require more parameters, and self-attention is weak at capturing local dependencies. Moreover, they all rely on an assumption of time invariance or stationarity, since they leverage parameter sharing by repeating a set of fixed architectures with fixed parameters over time or space. To address these issues, in this paper we propose a novel time-variant framework named Self-Attention-based Time-Variant Neural Networks (SATVNN), which is generally capable of capturing the dynamic changes of time series at different scales more accurately owing to its time-variant structure. SATVNN consists of self-attention blocks that seek to better capture the dynamic changes of recent data with the help of a Gaussian distribution, a Laplace distribution, and a novel Cauchy distribution, respectively. SATVNN clearly outperforms classical time series prediction methods and state-of-the-art deep learning models on many widely used real-world datasets. [ABSTRACT FROM AUTHOR]
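The abstract does not give the exact formulation of the distribution-assisted attention, so the following is only a minimal sketch of one plausible reading: standard scaled dot-product self-attention whose scores are re-weighted by the log-density of a Gaussian, Laplace, or Cauchy kernel over the temporal distance between positions, biasing the model toward recent, local dependencies. The function name, the bias forms, and the hyperparameter gamma are illustrative assumptions, not the paper's definition.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def distribution_biased_attention(X, Wq, Wk, Wv, kind="cauchy", gamma=1.0):
    """Self-attention with a distance-based bias (illustrative sketch).

    Attention scores are shifted by the log-density of a Gaussian,
    Laplace, or Cauchy kernel over |i - j|, so nearby time steps get
    more weight. This is an assumed reading of the paper's
    distribution-assisted attention, not its exact mechanism.
    """
    T, _ = X.shape
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])                  # (T, T)
    dist = np.abs(np.arange(T)[:, None] - np.arange(T)[None, :])
    if kind == "gaussian":
        bias = -(dist ** 2) / (2 * gamma ** 2)               # fast decay
    elif kind == "laplace":
        bias = -dist / gamma                                 # exponential decay
    else:                                                    # "cauchy"
        bias = -np.log1p((dist / gamma) ** 2)                # heavy tails
    return softmax(scores + bias) @ V

# Toy usage: 8 time steps, model width 4.
rng = np.random.default_rng(0)
X = rng.normal(size=(8, 4))
Wq, Wk, Wv = (rng.normal(size=(4, 4)) for _ in range(3))
out = distribution_biased_attention(X, Wq, Wk, Wv, kind="cauchy", gamma=2.0)
print(out.shape)  # (8, 4)
```

One intuition consistent with the abstract: the Cauchy kernel's heavy tails decay more slowly with distance than the Gaussian or Laplace kernels, so a Cauchy-biased block can still attend to farther time steps while keeping emphasis on recent ones.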

Details

Language :
English
ISSN :
0941-0643
Volume :
34
Issue :
11
Database :
Academic Search Index
Journal :
Neural Computing & Applications
Publication Type :
Academic Journal
Accession number :
156859349
Full Text :
https://doi.org/10.1007/s00521-021-06871-1