Convergence of Subtangent-Based Relaxations of Nonlinear Programs.

Authors :
Cao, Huiyi
Song, Yingkai
Khan, Kamil A.
Source :
Processes; Apr 2019, Vol. 7 Issue 4, p221, 1p
Publication Year :
2019

Abstract

Convex relaxations of functions are used to provide bounding information to deterministic global optimization methods for nonconvex systems. To be useful, these relaxations must converge rapidly to the original system as the considered domain shrinks. This article examines the convergence rates of convex outer approximations for functions and nonlinear programs (NLPs), constructed using affine subtangents of an existing convex relaxation scheme. It is shown that, under certain assumptions, these outer approximations inherit rapid second-order pointwise convergence from the original scheme. To support this analysis, the notion of second-order pointwise convergence is extended to constrained optimization problems, and general sufficient conditions guaranteeing this convergence are developed. The implications of these results are discussed. An implementation of subtangent-based relaxations of NLPs in Julia is described and applied to example problems for illustration. [ABSTRACT FROM AUTHOR]
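The construction the abstract describes can be illustrated with a minimal sketch. This is not the authors' Julia implementation; it is a hypothetical Python example that builds an affine subtangent of an αBB-style convex relaxation (an assumed relaxation scheme, not necessarily the one analyzed in the paper) of the nonconvex function f(x) = x³, and empirically checks that the pointwise gap at the interval midpoint shrinks roughly quadratically with the interval width:

```python
def alpha_bb_relaxation(a, b):
    """Convex alphaBB-style relaxation of f(x) = x**3 on [a, b].

    f''(x) = 6x has minimum 6a on [a, b], so alpha = max(0, -3a)
    makes f(x) + alpha*(x - a)*(x - b) convex and an underestimator.
    (Illustrative scheme; an assumption, not the paper's relaxation.)
    """
    alpha = max(0.0, -3.0 * a)
    f_cv = lambda x: x**3 + alpha * (x - a) * (x - b)
    d_cv = lambda x: 3.0 * x**2 + alpha * (2.0 * x - a - b)  # derivative of f_cv
    return f_cv, d_cv

def subtangent(a, b, x0):
    """Affine subtangent of the convex relaxation at x0.

    Since f_cv is convex, its tangent line at x0 underestimates f_cv,
    hence underestimates f on [a, b]: an affine outer approximation.
    """
    f_cv, d_cv = alpha_bb_relaxation(a, b)
    return lambda x: f_cv(x0) + d_cv(x0) * (x - x0)

def pointwise_gap(xbar, w):
    """Gap f(xbar) - g(xbar) for the subtangent built at the midpoint
    of the width-w interval centered at xbar."""
    a, b = xbar - w / 2.0, xbar + w / 2.0
    g = subtangent(a, b, xbar)
    return xbar**3 - g(xbar)

# Halving the interval width should shrink the midpoint gap by roughly 4x,
# consistent with second-order pointwise convergence.
g_wide = pointwise_gap(-0.5, 0.1)
g_narrow = pointwise_gap(-0.5, 0.05)
print(g_wide / g_narrow)
```

Here the gap ratio is close to 4 because the subtangent inherits the second-order pointwise convergence of the underlying convex relaxation, which is the behavior the article establishes in a general setting for NLPs.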

Details

Language :
English
ISSN :
2227-9717
Volume :
7
Issue :
4
Database :
Complementary Index
Journal :
Processes
Publication Type :
Academic Journal
Accession number :
136175790
Full Text :
https://doi.org/10.3390/pr7040221