
Exponential Convergence for Distributed Smooth Optimization Under the Restricted Secant Inequality Condition

Authors:
Yi, Xinlei
Zhang, Shengjun
Yang, Tao
Johansson, Karl H.
Chai, Tianyou
Publication Year:
2019

Abstract

This paper considers the distributed smooth optimization problem of minimizing a global cost function formed as the sum of local smooth cost functions, using only local information exchange. The standard assumption for proving exponential/linear convergence of first-order methods is strong convexity of the cost functions, which does not hold in many practical applications. In this paper, we first show that the continuous-time distributed primal-dual gradient algorithm converges exponentially to a global minimizer under the assumption that the global cost function satisfies the restricted secant inequality condition. This condition is weaker than strong convexity since it requires neither convexity nor uniqueness of the global minimizer. We then show that the discrete-time distributed primal-dual algorithm, constructed via Euler's approximation method, converges linearly to a global minimizer under the same condition. The theoretical results are illustrated by numerical simulations.
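The abstract invokes the restricted secant inequality (RSI) without stating it. As commonly defined in the first-order-methods literature (the paper's exact formulation may differ slightly), a differentiable function f with a nonempty set of minimizers X* satisfies the RSI condition with constant \nu > 0 if

\langle \nabla f(x),\, x - \mathcal{P}_{X^*}(x) \rangle \;\ge\; \nu \,\| x - \mathcal{P}_{X^*}(x) \|^2 \qquad \text{for all } x,

where \mathcal{P}_{X^*}(x) denotes the Euclidean projection of x onto X^*. Strong convexity with modulus \mu implies RSI with \nu = \mu, but RSI requires neither convexity nor a unique minimizer.

As a concrete illustration, below is a minimal Python sketch of one common discrete-time distributed primal-dual gradient update over an undirected graph with Laplacian L; the step size eta and gains alpha, beta are illustrative tuning parameters, and the update may differ in detail from the algorithm analyzed in the paper. The local costs are convex quadratics with a shared degenerate direction, so the global cost satisfies RSI but is not strongly convex and its minimizers are non-unique.

import numpy as np

rng = np.random.default_rng(0)
n, d = 4, 3  # number of agents, decision dimension

# Local costs f_i(x) = 0.5*||A_i x - b_i||^2. Every A_i has a zero last
# column, so the global Hessian sum(A_i^T A_i) is singular: the minimizers
# form a line (non-unique) and the global cost satisfies RSI (any convex
# quadratic does) but is not strongly convex.
A = [np.hstack([rng.standard_normal((2, d - 1)), np.zeros((2, 1))])
     for _ in range(n)]
b = [rng.standard_normal(2) for _ in range(n)]

def grad(i, x):
    return A[i].T @ (A[i] @ x - b[i])

# Laplacian of a ring graph on n nodes.
P = np.roll(np.eye(n), 1, axis=0)
L = 2 * np.eye(n) - P - P.T

eta, alpha, beta = 0.02, 2.0, 1.0  # illustrative constants, not from the paper
x = rng.standard_normal((n, d))    # x[i] is agent i's local copy
v = np.zeros((n, d))               # dual variables enforcing consensus

for _ in range(5000):
    g = np.stack([grad(i, x[i]) for i in range(n)])
    x, v = (x - eta * (g + alpha * (L @ x) + L @ v),  # primal descent
            v + eta * beta * (L @ x))                 # dual ascent

print("consensus error:", np.linalg.norm(L @ x))
print("global gradient norm:",
      np.linalg.norm(sum(grad(i, x[0]) for i in range(n))))

Both printed quantities should approach zero: the Laplacian term drives the local copies to consensus, while the dual variables steer the common value to a minimizer of the summed cost, matching the linear-rate behavior the abstract describes for the discretized algorithm.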

Details

Database:
arXiv
Publication Type:
Report
Accession Number:
edsarx.1909.03282
Document Type:
Working Paper