
A class of accelerated GADMM-based method for multi-block nonconvex optimization problems.

Authors :
Zhang, Kunyu
Shao, Hu
Wu, Ting
Wang, Xiaoquan
Source :
Numerical Algorithms, Apr. 2024, pp. 1-43.
Publication Year :
2024

Abstract

To improve computational efficiency, we consider a class of accelerated methods, based on the generalized alternating direction method of multipliers (GADMM), for solving multi-block nonconvex and nonsmooth optimization problems. First, we linearize the smooth part of the objective function and add proximal terms to the subproblems, resulting in the proximal linearized GADMM. Then, we introduce an inertial technique and obtain the inertial proximal linearized GADMM. The convergence of the regularized augmented Lagrangian function sequence is proved under appropriate assumptions. When some component functions of the objective function are convex, we use an error bound condition and show that the sequences generated by the algorithms converge locally to a critical point at an R-linear rate. Moreover, we apply the proposed algorithms to SCAD and robust PCA problems to verify their efficiency. [ABSTRACT FROM AUTHOR]
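For orientation only, the equations below give a minimal sketch of the kind of update a proximal linearized GADMM with an inertial step takes for a generic multi-block problem. The splitting f_i, the smooth coupling term h, the constraint data A_i and b, and the parameters beta, sigma_i, theta, rho are illustrative placeholders chosen here for exposition, not the authors' exact formulation.

% Illustrative sketch only, not the paper's exact scheme: a generic
% linearly constrained multi-block problem, its augmented Lagrangian,
% and one inertial proximal linearized GADMM-style iteration.
\begin{align*}
  &\min_{x_1,\dots,x_p}\ \sum_{i=1}^{p} f_i(x_i) + h(x_1,\dots,x_p)
    \quad \text{s.t.}\quad \sum_{i=1}^{p} A_i x_i = b,\\
  &\mathcal{L}_\beta(x_1,\dots,x_p,\lambda)
    = \sum_{i=1}^{p} f_i(x_i) + h(x_1,\dots,x_p)
      - \Bigl\langle \lambda,\ \textstyle\sum_{i} A_i x_i - b \Bigr\rangle
      + \frac{\beta}{2}\Bigl\|\textstyle\sum_{i} A_i x_i - b\Bigr\|^2,\\
  &\bar{x}_i^{\,k} = x_i^{k} + \theta\,(x_i^{k} - x_i^{k-1})
    \quad \text{(inertial extrapolation)},\\
  &x_i^{k+1} \in \arg\min_{x_i}\ f_i(x_i)
    + \bigl\langle \nabla_{x_i} h(\bar{x}^{\,k}),\, x_i \bigr\rangle
    + \frac{\sigma_i}{2}\bigl\|x_i - \bar{x}_i^{\,k}\bigr\|^2
    + \text{(augmented-Lagrangian coupling terms)},\\
  &\lambda^{k+1} = \lambda^{k} - \rho\,\beta\Bigl(\textstyle\sum_{i} A_i x_i^{k+1} - b\Bigr)
    \quad \text{($\rho$: GADMM relaxation parameter)}.
\end{align*}

The linearization of h and the proximal weight sigma_i keep each block subproblem simple (often a proximal mapping of f_i), while the extrapolation step with theta provides the acceleration.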

Details

Language :
English
ISSN :
1017-1398
Database :
Academic Search Index
Journal :
Numerical Algorithms
Publication Type :
Academic Journal
Accession number :
176530482
Full Text :
https://doi.org/10.1007/s11075-024-01821-z