
Directional proximal point method for convex optimization

Authors: Hwang, Wen-Liang; Yueh, Chang-Wei
Publication Year: 2023

Abstract

The use of proximal point operators for optimization can be computationally expensive when the dimensionality of a function (i.e., the number of variables) is high. In this study, we sought to reduce the cost of calculating proximal point operators by developing a directional operator in which the proximal regularization of a function is penalized along a specific direction. We used this operator in a novel approach to optimization, referred to as the directional proximal point method (Direction PPM). When using Direction PPM, the key to achieving convergence is the selection of direction sequences for the directional proximal point operators. In this paper, we present the conditions/assumptions under which directions capable of achieving global convergence for convex functions can be derived. As a lightweight version of PPM, Direction PPM uses scalar optimization via a direction envelope function to derive a stable step size, together with an auxiliary method to derive a direction sequence that satisfies the assumptions. This makes Direction PPM adaptable to a larger class of functions. Through applications to differentiable convex functions, we demonstrate that negative gradient directions at the current iterates could conceivably be used to achieve this end. We provide experimental results to illustrate the efficacy of Direction PPM in practice.
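
A minimal sketch of how one such directional proximal point iteration might look, assuming the directional operator reduces to a scalar subproblem along a chosen ray and that the direction is the (normalized) negative gradient at the current iterate. The names f, grad_f, lam, and directional_prox_step are illustrative and not taken from the paper.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def directional_prox_step(f, grad_f, x, lam=1.0):
    """One hypothetical directional proximal point step.

    The proximal regularization is penalized only along the direction d:
        t_k = argmin_t  f(x + t*d) + t**2 / (2*lam)
        x_next = x + t_k * d
    which is a scalar (1-D) optimization rather than a full n-dimensional prox.
    """
    g = grad_f(x)
    norm = np.linalg.norm(g)
    if norm == 0.0:           # already stationary for a differentiable convex f
        return x
    d = -g / norm             # unit negative-gradient direction at the current iterate
    # Scalar proximal subproblem restricted to the ray x + t*d; strictly convex in t
    # for convex f, so a 1-D solver suffices.
    res = minimize_scalar(lambda t: f(x + t * d) + t**2 / (2.0 * lam))
    return x + res.x * d

# Toy usage: minimize a simple convex quadratic f(x) = 0.5 * x^T A x.
A = np.diag([1.0, 10.0])
f = lambda x: 0.5 * x @ A @ x
grad_f = lambda x: A @ x

x = np.array([5.0, 5.0])
for _ in range(50):
    x = directional_prox_step(f, grad_f, x, lam=1.0)
print(x)  # iterates should approach the minimizer at the origin
```

The scalar subproblem above corresponds to the "light" aspect described in the abstract: each iteration optimizes a one-variable penalized function along the chosen direction instead of computing a full proximal operator in all variables.
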

Details

Database: arXiv
Publication Type: Report
Accession number: edsarx.2312.02612
Document Type: Working Paper