A MEMORY EFFICIENT INCREMENTAL GRADIENT METHOD FOR REGULARIZED MINIMIZATION
- Source :
- Bulletin of the Korean Mathematical Society. 53:589-600
- Publication Year :
- 2016
- Publisher :
- The Korean Mathematical Society, 2016.
-
Abstract
- In this paper, we propose a new incremental gradient method for solving a regularized minimization problem whose objective is the sum of m smooth functions and a (possibly nonsmooth) convex function. The method uses an adaptive stepsize. Recently proposed incremental gradient methods for regularized minimization require O(mn) storage, where n is the number of variables; this is their drawback. The proposed method, by contrast, requires only O(n) storage.

  1. Introduction

  In this paper, we consider the regularized minimization problem of the form

  (1) min_{x ∈ ℜ^n} F_λ(x) := f(x) + λP(x),

  where λ > 0, P : ℜ^n → (−∞, ∞] is a proper, convex, lower semicontinuous (lsc) function [20], and

  (2) f(x) := ∑_{i=1}^m f_i(x),

  where each function f_i is real-valued and smooth (i.e., continuously differentiable) on an open subset of ℜ^n containing dom P = {x | P(x) < ∞}. The minimization problem (1) arises in many applications such as (supervised) learning [7, 12, 27], regression [17, 23], neural network training [11, 21, 29], and data mining/classification [5, 15, 22, 28]. For the l
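Problems of the form (1)–(2) are commonly tackled by proximal incremental gradient iterations, which cycle through the components f_i and keep only the current iterate in memory, i.e., O(n) storage. The sketch below illustrates that generic scheme, not the paper's adaptive-stepsize method: it assumes least-squares components f_i(x) = ½(aᵢᵀx − bᵢ)² and the l1 regularizer P(x) = ‖x‖₁, whose proximal operator is soft-thresholding. The function names and the constant stepsize `alpha` are our own illustrative choices.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of t * ||x||_1 (soft-thresholding).
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def prox_incremental_gradient(A, b, lam, alpha=0.01, n_epochs=100):
    # Generic proximal incremental gradient sketch (NOT the paper's
    # adaptive-stepsize method) for
    #     min_x  sum_i 0.5*(a_i^T x - b_i)^2 + lam*||x||_1.
    # The regularizer is split evenly across the m components,
    # lam*P = sum_i (lam/m)*P, so each inner step applies the prox
    # with parameter alpha*lam/m.  Only the current iterate x is
    # kept in memory: O(n) storage.
    m, n = A.shape
    x = np.zeros(n)
    for _ in range(n_epochs):
        for i in range(m):                      # cycle through the m components
            grad_i = (A[i] @ x - b[i]) * A[i]   # gradient of f_i at x
            x = soft_threshold(x - alpha * grad_i, alpha * lam / m)
    return x
```

With a constant stepsize this scheme converges only to a neighborhood of the minimizer; a diminishing or adaptive stepsize rule, such as the one the paper proposes, is needed for exact convergence.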
- Subjects :
- 021103 operations research
Artificial neural network
General Mathematics
Minimization problem
0211 other engineering and technologies
Regular polygon
010103 numerical & computational mathematics
02 engineering and technology
Adaptive stepsize
01 natural sciences
Regularization (mathematics)
Combinatorics
Moving average
Minification
0101 mathematics
Gradient method
Mathematics
Details
- ISSN :
- 10158634
- Volume :
- 53
- Database :
- OpenAIRE
- Journal :
- Bulletin of the Korean Mathematical Society
- Accession number :
- edsair.doi...........356f261c1626142333ab0647e53f7e8d