1. A Transformer based approach to electricity load forecasting.
- Author
- Chan, Jun Wei and Yeo, Chai Kiat
- Subjects
- *TRANSFORMER models, *RECURRENT neural networks, *DEEP learning, *CONVOLUTIONAL neural networks, *ELECTRICITY, *NATURAL language processing, *FORECASTING, *PHOTOVOLTAIC power generation
- Abstract
In natural language processing (NLP), transformer-based models have surpassed recurrent neural networks (RNNs) as the state of the art, having been introduced specifically to address the limitations arising from the sequential nature of RNNs. Since time series prediction is a similar sequence modeling problem, transformer methods can be readily adapted to deep learning forecasting. This paper proposes a sparse transformer-based approach for electricity load prediction. The layers of a transformer address the shortcomings of RNNs and CNNs by applying the attention mechanism to the entire time series, allowing any data point in the input to influence any location in the layer's output. This lets transformers incorporate information from the entire sequence in a single layer. Attention computations can also be parallelized, so transformers can achieve faster speeds or trade that speed for more layers and increased complexity. In experiments on public datasets, the sparse transformer attained accuracy comparable to an RNN-based state-of-the-art method (Liu et al., 2022) while being up to 5× faster during inference. Moreover, extensive experiments show that the proposed model is general enough to forecast load from individual households up to city level. [ABSTRACT FROM AUTHOR]
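The full-sequence attention the abstract describes can be illustrated with a minimal sketch: a single dense attention layer in which every output time step is a weighted mix of all input time steps. This is an illustrative toy, not the paper's model; the variable names, dimensions, and random projection matrices are assumptions, and the paper's sparse variant would additionally restrict which input positions each output may attend to.

```python
import numpy as np

def scaled_dot_product_attention(X, Wq, Wk, Wv):
    """One dense attention layer over a whole time series: every output
    step can draw on every input step in a single pass, unlike an RNN's
    step-by-step recurrence. X has shape (T, d)."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])           # (T, T) pairwise influence
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # softmax over input steps
    return weights @ V, weights

rng = np.random.default_rng(0)
T, d = 24, 8                        # e.g. 24 hourly load readings, width 8 (hypothetical)
X = rng.normal(size=(T, d))         # embedded load sequence (toy data)
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
out, attn = scaled_dot_product_attention(X, Wq, Wk, Wv)
```

Each row of `attn` sums to 1 and gives the influence of every input time step on one output position, which is what allows a single layer to connect any input point to any output point; the T×T cost of this matrix is also what sparse attention variants aim to reduce.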
- Published
- 2024