
DRCNN: decomposing residual convolutional neural networks for time series forecasting.

Authors :
Zhu, Yuzhen
Luo, Shaojie
Huang, Di
Zheng, Weiyan
Su, Fang
Hou, Beiping
Source :
Scientific Reports. 11/18/2023, Vol. 13 Issue 1, p1-12. 12p.
Publication Year :
2023

Abstract

Recent studies have shown strong performance of Transformer-based models in long-term time series forecasting, owing to their ability to capture long-term dependencies. However, Transformers have limitations when trained on small datasets because they lack the inductive bias necessary for time series forecasting, and they do not show the same benefits in short-horizon forecasting as in long-horizon forecasting because the continuity of the sequence is not exploited. In this paper, efficient designs in Transformers are reviewed and a decomposing residual convolutional neural network, or DRCNN, is proposed. The DRCNN method exploits the continuity of the data by decomposing it into residual and trend terms, which are processed by a purpose-built convolution block, the DR-Block. The DR-Block draws its feature-extraction strength from following the structural design of Transformers. In addition, by imitating the multi-head mechanism in Transformers, a Multi-head Sequence method is proposed that enables the network to receive longer inputs and produce more accurate forecasts. The state-of-the-art performance of the presented model is demonstrated on several datasets. [ABSTRACT FROM AUTHOR]
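The abstract describes decomposing the input series into trend and residual terms before further processing. As an illustration only, the sketch below assumes a centered moving-average trend, a common choice for such decompositions; the paper's actual decomposition and DR-Block internals are not specified in the abstract.

```python
def decompose(series, window=5):
    """Split a series into a trend term (centered moving average)
    and a residual term, so that trend[i] + residual[i] == series[i].

    Illustrative only: the DRCNN paper's exact decomposition is not
    given in the abstract; this is a generic trend/residual split.
    """
    n = len(series)
    half = window // 2
    trend = []
    for i in range(n):
        # Average over a window centered at i, truncated at the edges.
        lo, hi = max(0, i - half), min(n, i + half + 1)
        seg = series[lo:hi]
        trend.append(sum(seg) / len(seg))
    # The residual is what the smooth trend does not explain.
    residual = [x - t for x, t in zip(series, trend)]
    return trend, residual
```

By construction the two terms sum back to the original series, so each can be modeled by its own branch (e.g. a convolution block) and the forecasts recombined.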

Details

Language :
English
ISSN :
2045-2322
Volume :
13
Issue :
1
Database :
Academic Search Index
Journal :
Scientific Reports
Publication Type :
Academic Journal
Accession number :
173738377
Full Text :
https://doi.org/10.1038/s41598-023-42815-6