
Ensemble Learning for Blending Gridded Satellite and Gauge-Measured Precipitation Data.

Authors :
Papacharalampous, Georgia
Tyralis, Hristos
Doulamis, Nikolaos
Doulamis, Anastasios
Source :
Remote Sensing. Oct 2023, Vol. 15, Issue 20, p4912. 19p.
Publication Year :
2023

Abstract

Regression algorithms are regularly used for improving the accuracy of satellite precipitation products. In this context, satellite precipitation and topography data are the predictor variables, and gauge-measured precipitation data are the dependent variables. Alongside this, it is increasingly recognised in many fields that combinations of algorithms through ensemble learning can lead to substantial predictive performance improvements. Still, a sufficient number of ensemble learners for improving the accuracy of satellite precipitation products and their large-scale comparison are currently missing from the literature. In this study, we work towards filling this gap by proposing 11 new ensemble learners in the field and by extensively comparing them. We apply the ensemble learners to monthly data from the PERSIANN (Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks) and IMERG (Integrated Multi-satellitE Retrievals for GPM) gridded datasets that span a 15-year period and cover the entire contiguous United States (CONUS). We also use gauge-measured precipitation data from the Global Historical Climatology Network monthly database, version 2 (GHCNm). The ensemble learners combine the predictions of six machine learning regression algorithms (base learners), namely the multivariate adaptive regression splines (MARS), multivariate adaptive polynomial splines (poly-MARS), random forests (RF), gradient boosting machines (GBM), extreme gradient boosting (XGBoost) and Bayesian regularized neural networks (BRNN), and each of them is based on a different combiner. The combiners include the equal-weight combiner, the median combiner, two best learners and seven variants of a sophisticated stacking method. The latter stacks a regression algorithm on top of the base learners to combine their independent predictions. Its seven variants are defined by seven different regression algorithms, specifically the linear regression (LR) algorithm and the six algorithms also used as base learners. The results suggest that sophisticated stacking performs significantly better than the base learners, especially when applied using the LR algorithm. It also beats the simpler combination methods. [ABSTRACT FROM AUTHOR]
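The stacking combiner described in the abstract can be illustrated with a short sketch. The code below is not the authors' implementation: it uses scikit-learn stand-ins for the base learners (random forest and gradient boosting only, since MARS, poly-MARS and BRNN are not available in scikit-learn), a linear-regression combiner, and synthetic data in place of the PERSIANN/IMERG predictors and GHCNm targets. The median combiner is shown for comparison.

```python
# Illustrative sketch of a stacked ensemble for satellite-gauge precipitation blending.
# NOT the paper's implementation: base learners, data, and column meanings are assumptions.
import numpy as np
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor, StackingRegressor
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Hypothetical predictors: satellite precipitation estimate and elevation; target: gauge precipitation.
X = rng.gamma(shape=2.0, scale=30.0, size=(2000, 2))
y = 0.8 * X[:, 0] + 0.01 * X[:, 1] + rng.normal(0.0, 5.0, size=2000)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

base_learners = [
    ("rf", RandomForestRegressor(n_estimators=200, random_state=0)),
    ("gbm", GradientBoostingRegressor(random_state=0)),
]

# Stacking: a linear-regression combiner is fit on cross-validated predictions of the base learners.
stack = StackingRegressor(estimators=base_learners, final_estimator=LinearRegression(), cv=5)
stack.fit(X_train, y_train)
print("stacked RMSE:", np.sqrt(mean_squared_error(y_test, stack.predict(X_test))))

# Median combiner for comparison: element-wise median of the base learners' independent predictions.
preds = np.column_stack([est.fit(X_train, y_train).predict(X_test) for _, est in base_learners])
print("median-combiner RMSE:", np.sqrt(mean_squared_error(y_test, np.median(preds, axis=1))))
```

In the paper, the stacking variants differ only in the regression algorithm used as the combiner (LR or any of the six base learners); in the sketch this corresponds to swapping the `final_estimator` argument.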

Details

Language :
English
ISSN :
20724292
Volume :
15
Issue :
20
Database :
Academic Search Index
Journal :
Remote Sensing
Publication Type :
Academic Journal
Accession number :
173337886
Full Text :
https://doi.org/10.3390/rs15204912