
Gradient Estimation Schemes for Noisy Functions.

Authors :
Brekelmans, R. C. M.
Driessen, L. T.
Hamers, H. J. M.
Den Hertog, D.
Source :
Journal of Optimization Theory & Applications; Sep 2005, Vol. 126 Issue 3, p529-551, 23p, 1 Chart, 4 Graphs
Publication Year :
2005

Abstract

In this paper, we analyze different schemes for obtaining gradient estimates when the underlying functions are noisy. Good gradient estimation is important, e.g., for nonlinear programming solvers. As the error criterion, we take the norm of the difference between the real and estimated gradients. The total error can be split into a deterministic error and a stochastic error. For three finite-difference schemes and two design-of-experiments (DoE) schemes, we analyze both the deterministic and stochastic errors. We also derive optimal stepsizes for each scheme, such that the total error is minimized. Some of the schemes have the nice property that this stepsize also minimizes the variance of the error. Based on these results, we show that, to obtain good gradient estimates for noisy functions, it is worthwhile to use DoE schemes. We recommend implementing such schemes in NLP solvers. [ABSTRACT FROM AUTHOR]
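The sketch below illustrates the trade-off the abstract describes for the finite-difference case: a smaller stepsize reduces the deterministic (truncation) error but amplifies the stochastic (noise) error, so an optimal stepsize balances the two. It is a minimal Python illustration using classical textbook stepsize rules; the noise level SIGMA, the test function, and the derivative bounds M2 and M3 are illustrative assumptions, and the paper's DoE schemes are not reproduced here.

import numpy as np

# Assumed noise level and test function (not from the paper).
SIGMA = 1e-4
rng = np.random.default_rng(0)

def f_noisy(x):
    """Smooth test function observed with additive zero-mean noise."""
    return np.sum(x**2) + 0.5 * np.sin(5.0 * x[0]) + SIGMA * rng.standard_normal()

def forward_difference_gradient(f, x, h):
    """Forward-difference gradient estimate with stepsize h (n + 1 evaluations)."""
    n = x.size
    g = np.empty(n)
    f0 = f(x)
    for i in range(n):
        e = np.zeros(n)
        e[i] = h
        g[i] = (f(x + e) - f0) / h
    return g

def central_difference_gradient(f, x, h):
    """Central-difference estimate: smaller deterministic error, 2n evaluations."""
    n = x.size
    g = np.empty(n)
    for i in range(n):
        e = np.zeros(n)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2.0 * h)
    return g

# Classical stepsize rules balancing truncation and noise error;
# M2 and M3 bound the second and third derivatives (assumed values).
M2, M3 = 2.0, 10.0
h_fd = 2.0 * np.sqrt(SIGMA / M2)          # forward differences
h_cd = (3.0 * SIGMA / M3) ** (1.0 / 3.0)  # central differences

x = np.array([0.3, -0.7])
true_grad = np.array([2.0 * x[0] + 2.5 * np.cos(5.0 * x[0]), 2.0 * x[1]])
for name, g in [("forward", forward_difference_gradient(f_noisy, x, h_fd)),
                ("central", central_difference_gradient(f_noisy, x, h_cd))]:
    print(name, np.linalg.norm(g - true_grad))

The printed quantity is the error criterion used in the abstract: the norm of the difference between the true and estimated gradients.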

Details

Language :
English
ISSN :
00223239
Volume :
126
Issue :
3
Database :
Complementary Index
Journal :
Journal of Optimization Theory & Applications
Publication Type :
Academic Journal
Accession number :
18371449
Full Text :
https://doi.org/10.1007/s10957-005-5496-2