
Windformer: A novel 4D high-resolution system for multi-step wind speed vector forecasting based on temporal shifted window multi-head self-attention.

Authors :
He, Jinhua
Hu, Zechun
Wang, Songpo
Mujeeb, Asad
Yang, Pengwei
Source :
Energy. Nov 2024, Vol. 310.
Publication Year :
2024

Abstract

Accurate wind speed forecasting (WSF) plays a pivotal role in anticipating the power output of wind farms. However, the stochastic and variable nature of wind speed makes accurate WSF challenging. This study therefore proposes Windformer, a novel 4D high-resolution system for wind speed vector forecasting (WSVF). Windformer combines the feature-extraction ability of convolutional neural networks (CNNs) with the information-fusion ability of attention-based transformers. Its input and output layers are composed mainly of 3D CNNs: the input layer performs feature extraction and information compression, while the output layer recovers the wind speed vector (WSV) field. The core of the model consists of encoders and decoders built on temporal shifted window multi-head self-attention, an architecture that effectively integrates spatio-temporal information. Trained on 39 years of regional reanalysis data, Windformer produces more accurate forecasts than five WSF baseline models. Most importantly, within the next 6 hours, Windformer is more accurate than the High-Resolution Deterministic Forecast (HRES) from the European Centre for Medium-Range Weather Forecasts (ECMWF).

• Windformer enables high-resolution regional-level wind speed vector predictions.
• Windformer surpasses baseline models in forecasting wind speeds for the upcoming 1 to 48 h.
• Training Windformer requires only two RTX 4090 GPUs.
• Windformer's MAE is about 0.05–0.2 m/s lower than that of HRES within the next 6 hours of wind speed forecast.
• Windformer efficiently handles spatio-temporal information using temporal shifted window multi-head self-attention.

[ABSTRACT FROM AUTHOR]
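The abstract does not detail the temporal shifted window mechanism, but Swin-style shifted-window attention typically cyclically shifts the field before partitioning it into fixed-size windows, so that successive layers exchange information across window boundaries. The following is a minimal numpy sketch of that partitioning step applied to a 4D wind field (time, height, width, channels); the function name, window sizes, and toy field are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def shifted_window_partition(x, window, shift):
    """Cyclically shift a (T, H, W, C) field and split it into
    non-overlapping (wt, wh, ww) windows for windowed self-attention."""
    # cyclic shift along time/height/width (undone after attention
    # with an opposite roll in a full shifted-window layer)
    x = np.roll(x, shift=(-shift[0], -shift[1], -shift[2]), axis=(0, 1, 2))
    T, H, W, C = x.shape
    wt, wh, ww = window
    # carve the field into a grid of windows
    x = x.reshape(T // wt, wt, H // wh, wh, W // ww, ww, C)
    # flatten each window into a token sequence: (num_windows, wt*wh*ww, C)
    return x.transpose(0, 2, 4, 1, 3, 5, 6).reshape(-1, wt * wh * ww, C)

# toy WSV field: 4 time steps on an 8x8 grid, 2 channels (u, v components)
field = np.random.randn(4, 8, 8, 2)
wins = shifted_window_partition(field, window=(2, 4, 4), shift=(1, 2, 2))
print(wins.shape)  # (8, 32, 2): 8 windows of 32 tokens each
```

Attention is then computed independently within each window, which keeps the cost linear in the number of grid points rather than quadratic in the full spatio-temporal sequence length.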

Details

Language :
English
ISSN :
03605442
Volume :
310
Database :
Academic Search Index
Journal :
Energy
Publication Type :
Academic Journal
Accession number :
180133586
Full Text :
https://doi.org/10.1016/j.energy.2024.133206