
MultiResFormer: Transformer with Adaptive Multi-Resolution Modeling for General Time Series Forecasting

Authors :
Du, Linfeng
Xin, Ji
Labach, Alex
Zuberi, Saba
Volkovs, Maksims
Krishnan, Rahul G.
Publication Year :
2023

Abstract

Transformer-based models have recently pushed the boundaries of time series forecasting. Existing methods typically encode time series data into $\textit{patches}$ using one or a fixed set of patch lengths. This, however, can limit their ability to capture the intricate temporal dependencies present in real-world multi-periodic time series. In this paper, we propose MultiResFormer, which dynamically models temporal variations by adaptively choosing optimal patch lengths. Concretely, at the beginning of each layer, the time series is encoded into several parallel branches, each using a detected periodicity, before passing through the transformer encoder block. We conduct extensive evaluations on long- and short-term forecasting datasets, comparing MultiResFormer with state-of-the-art baselines. MultiResFormer outperforms patch-based Transformer baselines on long-term forecasting tasks and also consistently outperforms CNN baselines by a large margin, while using far fewer parameters.
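The abstract describes detecting periodicities in the input series and patching the series once per detected period. The paper does not give its detection procedure here, but a common approach (used, e.g., by TimesNet-style models) is to take the dominant frequencies of the FFT amplitude spectrum and derive candidate patch lengths from them. The sketch below is a minimal illustration under that assumption; `detect_periods` and `patchify` are hypothetical helper names, not functions from the paper's code.

```python
import numpy as np

def detect_periods(x, k=2):
    """Return up to k candidate periods from the FFT amplitude spectrum of x."""
    amps = np.abs(np.fft.rfft(x - x.mean()))
    amps[0] = 0.0  # ignore the DC component
    top_bins = np.argsort(amps)[-k:][::-1]  # strongest frequency bins first
    return [len(x) // f for f in top_bins if f > 0]

def patchify(x, patch_len):
    """Split a 1-D series into non-overlapping patches of length patch_len,
    padding the tail with the edge value so the length divides evenly."""
    pad = (-len(x)) % patch_len
    xp = np.pad(x, (0, pad), mode="edge")
    return xp.reshape(-1, patch_len)

# Toy series: 192 steps of a sinusoid with period 24.
t = np.arange(192)
x = np.sin(2 * np.pi * t / 24)

periods = detect_periods(x, k=1)   # dominant period -> 24
patches = patchify(x, periods[0])  # shape (8, 24): one branch's patching
```

In a multi-resolution model, each detected period would drive one parallel branch, so the same series is patched at several resolutions before entering the encoder block.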

Details

Database :
OAIster
Publication Type :
Electronic Resource
Accession number :
edsoai.on1438505594
Document Type :
Electronic Resource