Improving multi-site photovoltaic forecasting with relevance amplification: DeepFEDformer-based approach.
- Source :
- Energy. Jul 2024, Vol. 299.
- Publication Year :
- 2024
Abstract
- Current research on photovoltaic (PV) forecasting focuses on using spatial information about PV sites to improve model accuracy, but most models must increase their network complexity to learn the spatial dependence. In this paper, we propose to take advantage of the known geographic locations of PV sites and embed them directly into the Decoder's input, making it easier for the model to focus its attention. We refer to this process as relevance amplification. On this basis, this paper proposes Relevance Amplification based DeepFEDformer (RAD-FEDformer), where DeepFEDformer adds multiple Multi-Layer Perceptron (MLP) layers to FEDformer to improve the perception of deep features. The Relevance Amplification Module (RAM) is designed to receive geographic correlation information, enhancing its influence on the seasonality component of the Decoder input and improving the performance of the attention mechanism in the model. Using power generation data from PV plants distributed across 11 regions of Belgium as our case study, we evaluated the performance of RAD-FEDformer in predicting PV data 48/96/192 time steps into the future for each plant. Compared to other Transformer-family models, RAD-FEDformer achieved state-of-the-art results, demonstrating significant R² improvements of 16.70%, 51.45%, and 23.50% over FEDformer, along with an average MSE reduction of 16.9%. We designed ablation experiments to validate and compare the performance of MLPs of different sizes on the ETTm1 dataset, and discussed the performance of RAM and its effect on the attention mechanism on the PV dataset. The results show that our model enhances the performance of the attention mechanism in multi-site PV prediction scenarios, and that the optimization from RAM is more effective for longer sequence predictions.
• Proposes a module for embedding geographic information into input data with low overhead.
• Improves the FEDformer model using Multi-Layer Perceptrons.
• Discusses the performance of the model on multivariate prediction tasks based on the ETT and PV datasets.
• Compares several popular Transformer-family forecasting models in the case study. [ABSTRACT FROM AUTHOR]
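The record does not include the paper's actual formulation of the Relevance Amplification Module. As a rough, hypothetical sketch of the core idea the abstract describes — using fixed geographic correlation between sites to reweight the seasonality component of the decoder input, without adding trainable layers — one might write (all function names, the inverse-distance weighting, and the coordinate encoding here are illustrative assumptions, not the authors' method):

```python
import numpy as np

def geo_correlation(coords):
    """Row-normalized inverse-distance weights between PV sites.

    coords: (n_sites, 2) array of site locations. The choice of
    1 / (1 + distance) is an illustrative assumption; the paper's
    geographic correlation measure is not given in this record.
    """
    diff = coords[:, None, :] - coords[None, :, :]
    dist = np.linalg.norm(diff, axis=-1)
    w = 1.0 / (1.0 + dist)              # closer sites get larger weights
    return w / w.sum(axis=1, keepdims=True)

def amplify_seasonal(seasonal, coords):
    """Blend each site's seasonal decoder input with nearby sites'.

    seasonal: (time_steps, n_sites) seasonality component of the
    decoder input. The blended array replaces it directly, so the
    spatial prior enters with no extra network complexity.
    """
    w = geo_correlation(coords)
    return seasonal @ w.T               # out[t, i] = sum_j w[i, j] * seasonal[t, j]
```

For two sites at (0, 0) and (3, 4), the normalized weights come out to [[6/7, 1/7], [1/7, 6/7]], so each site's seasonal series picks up a 1/7 contribution from its neighbor while keeping most of its own signal.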
- Subjects :
- *FORECASTING
*MULTILAYER perceptrons
*PREDICTION models
Details
- Language :
- English
- ISSN :
- 03605442
- Volume :
- 299
- Database :
- Academic Search Index
- Journal :
- Energy
- Publication Type :
- Academic Journal
- Accession number :
- 177248966
- Full Text :
- https://doi.org/10.1016/j.energy.2024.131479