Leveraging Simplex Gradient Variance and Bias Reduction for Black-Box Optimization of Noisy and Costly Functions
- Source :
- IEEE Access, Vol 13, pp. 14304-14316 (2025)
- Publication Year :
- 2025
- Publisher :
- IEEE, 2025.
Abstract
- Gradient variance errors in gradient-based search methods are largely mitigated by momentum; gradient bias errors, however, may prevent a numerical search from reaching the true optimum. We investigate the reduction in both the bias and the variance of the simplex gradient estimated from noisy function measurements, compared with the finite-difference gradient, when either is used in black-box optimization methods. While the simplex gradient, regardless of its orientation, reduces the bias error owed to factors such as truncation, numerical error, or measurement noise, we claim and verify that, under relaxed assumptions about the underlying function’s differentiability, it also has at most half the variance of the finite-difference gradient. The findings are validated with two comprehensive and representative case studies: the minimization of a nonlinear feedback control system’s cost function, and the tuning of the hyperparameters of a deep machine-learning classification problem. We conclude that for small- to medium-size practical black-box optimization problems with unknown variable domains, where noisy function measurements are expensive, a simplex gradient-based search is an attractive option.
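- To make the comparison concrete, the following minimal sketch (not from the paper) contrasts the two gradient estimators on a noisy quadratic. The objective noisy_f, the noise level SIGMA, the step size H, the test point, and the convention that both estimators sample within radius H of the evaluation point are all illustrative assumptions; the paper's exact scaling, assumptions, and experimental setup may differ.

```python
import numpy as np

rng = np.random.default_rng(0)

SIGMA = 0.01  # measurement-noise standard deviation (assumed)
H = 0.1       # sampling radius for both estimators (assumed)

def noisy_f(x):
    """Noisy black-box objective: a quadratic plus Gaussian measurement noise."""
    return 0.5 * x @ x + rng.normal(0.0, SIGMA)

def fd_gradient(f, x, h=H):
    """Forward finite-difference gradient: n + 1 noisy evaluations."""
    n = len(x)
    f0 = f(x)
    g = np.empty(n)
    for i in range(n):
        step = np.zeros(n)
        step[i] = h
        g[i] = (f(x + step) - f0) / h
    return g

def regular_simplex(x, h=H):
    """n + 1 vertices of a regular simplex centered at x, circumradius h."""
    n = len(x)
    c = (1.0 - np.sqrt(n + 1.0)) / n
    V = np.vstack([np.full(n, c), np.eye(n)])  # a regular unit simplex
    V -= V.mean(axis=0)                        # center it at the origin
    V *= h / np.linalg.norm(V[0])              # put each vertex at distance h
    return x + V

def simplex_gradient(f, vertices):
    """Simplex gradient: solve S g = delta-f over the edges from vertex 0."""
    f0 = f(vertices[0])
    S = vertices[1:] - vertices[0]             # rows are edge directions
    df = np.array([f(v) for v in vertices[1:]]) - f0
    return np.linalg.solve(S, df)

# Empirical per-component variance of each estimator at a fixed test point,
# with matched budgets of n + 1 noisy evaluations per estimate.
x = np.array([1.0, -2.0, 0.5])
fd = np.array([fd_gradient(noisy_f, x) for _ in range(2000)])
sx = np.array([simplex_gradient(noisy_f, regular_simplex(x))
               for _ in range(2000)])
print("finite-difference variance:", fd.var(axis=0))
print("simplex-gradient variance: ", sx.var(axis=0))
```

- Under this matched-radius convention the simplex edges are longer than the coordinate steps, which gives one intuition for the reduced noise amplification; the paper's formal half-variance claim rests on its own assumptions and normalization.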
Details
- Language :
- English
- ISSN :
- 21693536
- Volume :
- 13
- Database :
- Directory of Open Access Journals
- Journal :
- IEEE Access
- Publication Type :
- Academic Journal
- Accession number :
- edsdoj.ba0787d35f14259acb177dd3c30cadb
- Document Type :
- article
- Full Text :
- https://doi.org/10.1109/ACCESS.2025.3529915