
CTF-former: A novel simplified multi-task learning strategy for simultaneous multivariate chaotic time series prediction.

Authors :
Fu K
Li H
Shi X
Source :
Neural networks : the official journal of the International Neural Network Society [Neural Netw] 2024 Jun; Vol. 174, pp. 106234. Date of Electronic Publication: 2024 Mar 14.
Publication Year :
2024

Abstract

Multivariate chaotic time series prediction is a challenging task, especially when multiple variables are predicted simultaneously. Multiple related prediction tasks typically require multiple models, but multiple models are difficult to keep synchronized, which makes immediate communication between predicted values challenging. Although multi-task learning can be applied to this problem, the principles for allocating and arranging shared and task-specific representations are ambiguous. To address this issue, a novel simplified multi-task learning method was proposed for the precise implementation of simultaneous multiple chaotic time series prediction tasks. The proposed scheme consists of a cross-convolution operator, designed to capture variable correlations and sequence correlations, and an attention module, designed to capture the information embedded in the sequence structure. In the attention module, a non-linear transformation was implemented with convolution, so that its local receptive field and the global dependency of the attention mechanism complement each other. In addition, an attention weight calculation was devised that accounts not only for the synergy of time-domain and frequency-domain features, but also for the fusion of series and channel information. Notably, the scheme adopts a purely simplified design principle of multi-task learning by reducing each task-specific network to a single neuron. The precision of the proposed solution and its potential for engineering applications were verified on the Lorenz system and a power consumption dataset. Compared to the Gated Recurrent Unit, the mean absolute error of the proposed method was reduced by an average of 82.9% on the Lorenz system and 19.83% on power consumption.

Competing Interests: Declaration of competing interest: The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

(Copyright © 2024 Elsevier Ltd. All rights reserved.)
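The abstract gives no implementation details, but the "task-specific network reduced to a single neuron" idea can be illustrated with a minimal sketch. The PyTorch code below is a hypothetical illustration, not the authors' CTF-former: a plain Conv1d trunk stands in for the paper's cross-convolution and attention modules, and every name and hyperparameter here is an assumption chosen for readability.

```python
# Hypothetical sketch of the single-neuron multi-task design described in the
# abstract: one shared encoder feeds a 1-output linear head per variable, so
# all variables are predicted simultaneously by a single synchronized model.
# NOTE: this is NOT the CTF-former itself; the Conv1d trunk merely stands in
# for the paper's cross-convolution operator and attention module.
import torch
import torch.nn as nn


class SharedEncoderMultiTask(nn.Module):
    def __init__(self, n_vars: int, hidden: int = 64):
        super().__init__()
        # Shared trunk: mixes information across variables (channels) and
        # time steps into one representation used by every task head.
        self.encoder = nn.Sequential(
            nn.Conv1d(n_vars, hidden, kernel_size=3, padding=1),
            nn.GELU(),
            nn.AdaptiveAvgPool1d(1),  # collapse the time axis
            nn.Flatten(),             # -> (batch, hidden)
        )
        # Task-specific part reduced to a single neuron per variable,
        # mirroring the abstract's simplified design principle.
        self.heads = nn.ModuleList(
            [nn.Linear(hidden, 1) for _ in range(n_vars)]
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, n_vars, seq_len) -> one next-step value per variable
        z = self.encoder(x)
        return torch.cat([head(z) for head in self.heads], dim=-1)


# Toy usage with Lorenz-like shapes: 3 variables (x, y, z), windows of 32.
model = SharedEncoderMultiTask(n_vars=3)
window = torch.randn(8, 3, 32)   # batch of 8 input windows
pred = model(window)             # (8, 3): simultaneous predictions
print(pred.shape)
```

Because the heads share one trunk, the predicted values for all variables are produced in a single forward pass, which is the synchronization property the abstract contrasts with maintaining separate per-variable models.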

Details

Language :
English
ISSN :
1879-2782
Volume :
174
Database :
MEDLINE
Journal :
Neural networks : the official journal of the International Neural Network Society
Publication Type :
Academic Journal
Accession number :
38521015
Full Text :
https://doi.org/10.1016/j.neunet.2024.106234