DRCNN: decomposing residual convolutional neural networks for time series forecasting.
- Source :
- Scientific reports [Sci Rep] 2023 Sep 23; Vol. 13 (1), pp. 15901. Date of Electronic Publication: 2023 Sep 23.
- Publication Year :
- 2023
Abstract
- Recent studies have shown strong performance of Transformer-based models in long-term time series forecasting due to their ability to capture long-term dependencies. However, Transformers are limited when trained on small datasets because they lack the inductive bias needed for time series forecasting, and they do not show the same benefits in short-horizon forecasting as in long-horizon forecasting because the continuity of the sequence is not emphasized. In this paper, efficient Transformer designs are reviewed and a decomposing residual convolutional neural network, or DRCNN, is proposed. DRCNN exploits the continuity between data points by decomposing the series into residual and trend terms, which are processed by a purpose-built convolution block, the DR-Block. The DR-Block draws its strength in feature extraction from following the structural design of Transformers. In addition, by imitating the multi-head mechanism of Transformers, a Multi-head Sequence method is proposed that enables the network to receive longer inputs and produce more accurate forecasts. State-of-the-art performance of the presented model is demonstrated on several datasets.
- (© 2023. Springer Nature Limited.)
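The abstract does not specify how DRCNN performs the trend/residual split; a common approach in this line of work is a moving-average decomposition. Below is a minimal sketch of that idea (the function name, kernel size, and edge-padding choice are illustrative assumptions, not the paper's implementation):

```python
import numpy as np

def decompose(series: np.ndarray, kernel: int = 5) -> tuple[np.ndarray, np.ndarray]:
    """Split a 1-D series into a smooth trend term and a residual term.

    NOTE: illustrative moving-average decomposition; the actual DRCNN
    decomposition may differ from this sketch.
    """
    # Pad with edge values so the trend has the same length as the input.
    pad = kernel // 2
    padded = np.pad(series, (pad, kernel - 1 - pad), mode="edge")
    # Moving average as the trend; what remains is the residual.
    trend = np.convolve(padded, np.ones(kernel) / kernel, mode="valid")
    residual = series - trend
    return trend, residual

t = np.arange(100, dtype=float)
series = 0.1 * t + np.sin(t)  # linear trend plus an oscillating component
trend, residual = decompose(series)
assert np.allclose(trend + residual, series)  # the split is lossless
```

Because trend and residual sum back to the original series, each term can be processed by its own branch (in DRCNN, the DR-Block) and the forecasts recombined without losing information.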
Details
- Language :
- English
- ISSN :
- 2045-2322
- Volume :
- 13
- Issue :
- 1
- Database :
- MEDLINE
- Journal :
- Scientific reports
- Publication Type :
- Academic Journal
- Accession number :
- 37741848
- Full Text :
- https://doi.org/10.1038/s41598-023-42815-6