1. Stochastic Bigger Subspace Algorithms for Nonconvex Stochastic Optimization
- Author
- Gonglin Yuan, Yingjie Zhou, Liping Wang, and Qingyuan Yang
- Subjects
- Stochastic subspace algorithm, nonconvex function, convergence property, machine learning, complexity analysis, Electrical engineering. Electronics. Nuclear engineering, TK1-9971
- Abstract
It is well known that the stochastic optimization problem is among the hardest problems, since in most cases the values of $f$ and its gradient are not easy to compute, $F(\cdot, \xi)$ is often not given explicitly, and (or) the distribution function $P$ is ambiguous. Designing an effective optimization algorithm to solve this problem is therefore an interesting task. This paper designs stochastic bigger subspace algorithms for solving nonconvex stochastic optimization problems. A general framework for such algorithms is presented for convergence analysis, where the so-called sufficient descent property, the trust region feature, and global convergence to stationary points are proved under suitable conditions. In the worst case, we show that the complexity is competitive under a given accuracy parameter. We prove that the $SFO$-call complexity of the presented algorithm with diminishing steplength is $O\left({\epsilon ^{-{\frac {1}{1-\beta }}}}\right)$ and that the $SFO$-call complexity of the given algorithm with random constant steplength is $O(\epsilon ^{-2})$, where $\beta \in (0.5,1)$ and $\epsilon$ is the accuracy, and the required conditions are weaker than those of quasi-Newton methods and standard conjugate gradient algorithms. A detailed algorithm framework with variance reduction is also proposed for the experiments, and a nonconvex binary classification problem is solved to demonstrate the performance of the given algorithm.
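The abstract describes a stochastic subspace iteration for nonconvex problems such as binary classification, run with the diminishing steplength $k^{-\beta}$, $\beta \in (0.5,1)$, quoted in the complexity bound. The following is only a minimal sketch of this class of methods under stated assumptions: it uses a nonconvex sigmoid loss and a randomly sampled coordinate subspace in place of the paper's specific "bigger subspace" construction, and the function names and parameters (`sigmoid_loss_grad`, `stochastic_subspace_sgd`, `subspace_dim`) are hypothetical, not taken from the paper.

```python
import numpy as np

def sigmoid_loss_grad(w, X, y, batch):
    """Minibatch stochastic gradient of the nonconvex sigmoid loss
    f(w) = mean_i 1 / (1 + exp(y_i * x_i^T w)), labels y_i in {-1, +1}."""
    Xb, yb = X[batch], y[batch]
    z = np.clip(yb * (Xb @ w), -30.0, 30.0)   # clip margins to avoid overflow
    s = 1.0 / (1.0 + np.exp(z))               # per-sample loss value
    coef = -s * (1.0 - s) * yb                 # derivative of the loss w.r.t. w via the margin
    return Xb.T @ coef / len(batch)

def stochastic_subspace_sgd(X, y, beta=0.7, subspace_dim=20,
                            batch_size=32, iters=2000, seed=0):
    """Illustrative stochastic subspace iteration (an assumption, not the
    paper's rule): at step k, draw a random coordinate subspace, take the
    negative stochastic gradient restricted to that subspace, and use the
    diminishing steplength alpha_k = k^(-beta) with beta in (0.5, 1)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for k in range(1, iters + 1):
        batch = rng.choice(n, size=min(batch_size, n), replace=False)
        g = sigmoid_loss_grad(w, X, y, batch)
        coords = rng.choice(d, size=min(subspace_dim, d), replace=False)
        direction = np.zeros(d)
        direction[coords] = -g[coords]         # projected negative gradient
        w += (k ** -beta) * direction          # diminishing steplength
    return w

if __name__ == "__main__":
    # Toy usage: random features, labels in {-1, +1}.
    rng = np.random.default_rng(1)
    X = rng.standard_normal((500, 50))
    y = np.sign(X @ rng.standard_normal(50) + 0.1 * rng.standard_normal(500))
    w = stochastic_subspace_sgd(X, y)
    print(f"training accuracy: {np.mean(np.sign(X @ w) == y):.3f}")
```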
- Published
- 2021