
On the Convergence and Complexity of the Stochastic Central Finite-Difference Based Gradient Estimation Methods

Authors: Bollapragada, Raghu; Karamanli, Cem
Publication Year: 2025

Abstract

This paper presents an algorithmic framework for solving unconstrained stochastic optimization problems using only stochastic function evaluations. We employ central finite-difference based gradient estimation methods to approximate the gradients and dynamically control the accuracy of these approximations by adjusting the sample sizes used in the stochastic realizations. We analyze the theoretical properties of the proposed framework on nonconvex functions. Our analysis yields sublinear convergence to a neighborhood of the solution and establishes, for each gradient estimation method, the optimal worst-case iteration complexity $\mathcal{O}(\epsilon^{-1})$ and sample complexity $\mathcal{O}(\epsilon^{-2})$ for reaching an $\epsilon$-accurate solution. Finally, we demonstrate the performance of the proposed framework and the quality of the gradient estimation methods through numerical experiments on nonlinear least squares problems.
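To make the gradient estimation idea concrete, the following is a minimal sketch, not the authors' exact algorithm: a central finite-difference estimator of the gradient of F(x) = E[f(x, xi)], where averaging over a batch of stochastic evaluations controls the estimator's accuracy, mirroring the paper's dynamic sample-size idea. The function stoch_f, the difference interval h, and the batch size are all illustrative placeholders, not quantities from the paper.

    import numpy as np

    def central_fd_gradient(stoch_f, x, h=1e-4, batch_size=32, rng=None):
        """Estimate the gradient of F(x) = E[f(x, xi)] by central differences.

        stoch_f(x, rng) returns one noisy evaluation f(x, xi); averaging
        batch_size central differences per coordinate reduces the variance
        of the estimate. All names and defaults here are hypothetical.
        """
        rng = rng or np.random.default_rng()
        d = x.size
        grad = np.zeros(d)
        for i in range(d):
            e_i = np.zeros(d)
            e_i[i] = 1.0
            # Average batch_size central differences along coordinate i.
            diffs = [
                (stoch_f(x + h * e_i, rng) - stoch_f(x - h * e_i, rng)) / (2.0 * h)
                for _ in range(batch_size)
            ]
            grad[i] = np.mean(diffs)
        return grad

    # Example: noisy quadratic f(x, xi) = 0.5 * ||x||^2 + noise,
    # whose true gradient is x itself.
    stoch_f = lambda x, rng: 0.5 * (x @ x) + 0.01 * rng.standard_normal()
    x = np.array([1.0, -2.0])
    print(central_fd_gradient(stoch_f, x))  # close to [1.0, -2.0]

Increasing batch_size tightens the gradient approximation at the cost of more function evaluations per iteration; the paper's framework adjusts this trade-off dynamically to attain the stated iteration and sample complexities.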

Details

Database: arXiv
Publication Type: Report
Accession number: edsarx.2501.06610
Document Type: Working Paper