Perspective maximum likelihood-type estimation via proximal decomposition
- Author
Patrick L. Combettes and Christian L. Müller
- Subjects
Statistics and Probability; Statistics, Probability and Uncertainty; Mathematics - Statistics Theory (math.ST); FOS: Mathematics; perspective function; concomitant M-estimator; robust regression; heteroscedastic model; convex optimization; proximal algorithm; 62J02; 62P10; 46N30; 90C25
- Abstract
We introduce a flexible optimization model for maximum likelihood-type estimation (M-estimation) that encompasses and generalizes a large class of existing statistical models, including Huber’s concomitant M-estimator, Owen’s Huber/Berhu concomitant estimator, the scaled lasso, support vector machine regression, and penalized estimation with structured sparsity. The model, termed perspective M-estimation, leverages the observation that convex M-estimators with concomitant scale as well as various regularizers are instances of perspective functions, a construction that extends a convex function to a jointly convex one in terms of an additional scale variable. These nonsmooth functions are shown to be amenable to proximal analysis, which leads to principled and provably convergent optimization algorithms via proximal splitting. We derive novel proximity operators for several perspective functions of interest via a geometrical approach based on duality. We then devise a new proximal splitting algorithm to solve the proposed M-estimation problem and establish the convergence of both the scale and regression iterates it produces to a solution. Numerical experiments on synthetic and real-world data illustrate the broad applicability of the proposed framework.
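To make the construction described above concrete, here is a brief sketch in standard notation (the symbols \(\varphi\), \(s\), and \(x\) are not taken from the record and are used only for illustration): the perspective of a proper lower semicontinuous convex function \(\varphi\) on \(\mathbb{R}^N\) is the jointly convex function
\[
\widetilde{\varphi}\colon (s,x) \mapsto
\begin{cases}
s\,\varphi(x/s), & \text{if } s > 0,\\
(\operatorname{rec}\varphi)(x), & \text{if } s = 0,\\
+\infty, & \text{if } s < 0,
\end{cases}
\]
where \(\operatorname{rec}\varphi\) denotes the recession function of \(\varphi\). For instance, up to normalization constants, the scaled lasso mentioned in the abstract minimizes \(\sigma/2 + \|y - Xb\|^2/(2\sigma) + \lambda\|b\|_1\) over the regression vector \(b\) and the scale \(\sigma > 0\); the first two terms are the perspective of \(x \mapsto (\|x\|^2 + 1)/2\) evaluated at the scale \(\sigma\) and the residual \(y - Xb\), which is the kind of jointly convex scale-plus-regression structure the perspective M-estimation framework is designed to handle.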
- Published
2020