Block-Coordinate Gradient Descent Method for Linearly Constrained Nonsmooth Separable Optimization.

Authors :
P. Tseng
S. Yun
Source :
Journal of Optimization Theory & Applications, Mar 2009, Vol. 140, Issue 3, p513-535. 23p.
Publication Year :
2009

Abstract

We consider the problem of minimizing the weighted sum of a smooth function f and a convex function P of n real variables, subject to m linear equality constraints. We propose a block-coordinate gradient descent method for solving this problem, with the coordinate block chosen by a Gauss-Southwell-q rule based on sufficient predicted descent. We establish global convergence to first-order stationarity for this method and, under a local error-bound assumption, a linear rate of convergence. If f is convex with Lipschitz continuous gradient, then the method terminates in O(n²/ε) iterations with an ε-optimal solution. If P is separable, then the Gauss-Southwell-q rule is implementable in O(n) operations when m = 1 and in O(n²) operations when m > 1. In the special case of support vector machine training, for which f is convex quadratic, P is separable, and m = 1, this complexity bound is comparable to the best known bound for decomposition methods. If f is convex, then, by gradually reducing the weight on P to zero, the method can be adapted to solve the bilevel problem of minimizing P over the set of minima of f + δ_X, where δ_X is the indicator function of X, the closure of the feasible set. This has application in finding the least 1-norm solution of maximum-likelihood estimation. [ABSTRACT FROM AUTHOR]
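As a rough illustration of the method the abstract describes, the sketch below implements a simplified pairwise block-coordinate descent for the m = 1 case with P(x) = λ‖x‖₁ and a constraint fixing the sum of the coordinates. It is a minimal sketch under stated assumptions, not the authors' algorithm: the pair is chosen by a crude largest-gradient-gap heuristic rather than the paper's Gauss-Southwell-q rule, the curvature constant L is a simple diagonal stand-in for the paper's quadratic model, and the names (pairwise_bcgd_l1, grad_f) are hypothetical.

import numpy as np

def pairwise_bcgd_l1(grad_f, x0, lam, L, n_iters=200):
    """Simplified pairwise coordinate-descent sketch for
        min f(x) + lam * ||x||_1   s.t.   sum(x) = const   (m = 1 case).
    x0 must be feasible; every update moves along e_i - e_j, which
    preserves the equality constraint exactly.
    grad_f : callable returning the gradient of the smooth part f.
    L      : curvature (Lipschitz) estimate for the 1-D quadratic model.
    """
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(n_iters):
        g = grad_f(x)
        # Crude stand-in for the Gauss-Southwell-q rule: pick the pair
        # with the largest smooth-gradient gap.
        i, j = int(np.argmin(g)), int(np.argmax(g))
        if i == j:
            break
        # 1-D model along d = e_i - e_j:
        #   q(t) = (g[i] - g[j]) t + 0.5 L t^2
        #          + lam |x[i] + t| + lam |x[j] - t|
        # q is piecewise quadratic with breakpoints at -x[i] and x[j];
        # its minimizer is a breakpoint or a clipped stationary point
        # of one of the three smooth pieces.
        gd = g[i] - g[j]
        bps = sorted([-x[i], x[j]])
        cands = list(bps)
        for lo, hi in [(-np.inf, bps[0]), (bps[0], bps[1]), (bps[1], np.inf)]:
            # Read off the l1 subgradient signs at an interior point.
            if np.isinf(lo):
                p = hi - 1.0
            elif np.isinf(hi):
                p = lo + 1.0
            else:
                p = 0.5 * (lo + hi)
            s = lam * np.sign(x[i] + p) - lam * np.sign(x[j] - p)
            cands.append(float(np.clip(-(gd + s) / L, lo, hi)))
        def q(t):
            return gd * t + 0.5 * L * t * t + lam * abs(x[i] + t) + lam * abs(x[j] - t)
        t_best = min(cands, key=q)
        if abs(t_best) < 1e-12:
            break
        x[i] += t_best
        x[j] -= t_best
    return x

The 1-D piecewise-quadratic subproblem is minimized in closed form by checking its breakpoints and the stationary points of its smooth pieces; this mirrors, in spirit, the cheap exact block subproblem solves that make the Gauss-Southwell-q rule implementable in O(n) operations when m = 1.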

Details

Language :
English
ISSN :
0022-3239
Volume :
140
Issue :
3
Database :
Academic Search Index
Journal :
Journal of Optimization Theory & Applications
Publication Type :
Academic Journal
Accession number :
36502896
Full Text :
https://doi.org/10.1007/s10957-008-9458-3