
DADA: Dual Averaging with Distance Adaptation

Authors:
Moshtaghifar, Mohammad
Rodomanov, Anton
Vankov, Daniil
Stich, Sebastian
Publication Year:
2025

Abstract

We present a novel universal gradient method for solving convex optimization problems. Our algorithm -- Dual Averaging with Distance Adaptation (DADA) -- is based on the classical scheme of dual averaging and dynamically adjusts its coefficients based on observed gradients and the distance between iterates and the starting point, eliminating the need for problem-specific parameters. DADA is a universal algorithm that simultaneously works for a broad spectrum of problem classes, provided the local growth of the objective function around its minimizer can be bounded. Particular examples of such problem classes are nonsmooth Lipschitz functions, Lipschitz-smooth functions, Hölder-smooth functions, functions with high-order Lipschitz derivative, quasi-self-concordant functions, and $(L_0,L_1)$-smooth functions. Crucially, DADA is applicable to both unconstrained and constrained problems, even when the domain is unbounded, without requiring prior knowledge of the number of iterations or desired accuracy.
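To make the abstract's mechanism concrete, the sketch below illustrates the general idea of distance-adaptive dual averaging: keep a running sum of observed gradients (the dual-averaging part) and scale each step by an estimate of the distance travelled from the starting point divided by the accumulated gradient norms (the distance-adaptation part). This is a hedged illustration only; the specific coefficient rule, the function names, and the initial distance guess `r_eps` are assumptions for this sketch, not the exact scheme analyzed in the paper.

```python
import numpy as np

def distance_adaptive_dual_averaging(f, grad, x0, n_iters=200, r_eps=1e-3):
    """Illustrative sketch in the spirit of DADA (not the paper's exact rule).

    Maintains a running gradient sum (dual averaging), an estimate r of the
    distance of the iterates from the starting point, and the cumulative
    squared gradient norm G; each step is taken from x0 with scale r/sqrt(G).
    """
    x = x0.astype(float)
    grad_sum = np.zeros_like(x)
    G = 0.0          # cumulative squared gradient norm
    r = r_eps        # small initial distance estimate (assumed hyperparameter)
    best = x.copy()  # best iterate seen so far, by function value
    for _ in range(n_iters):
        g = grad(x)
        grad_sum += g
        G += float(g @ g)
        r = max(r, float(np.linalg.norm(x - x0)))
        # Dual-averaging step: always taken from the starting point x0.
        x = x0 - (r / np.sqrt(G)) * grad_sum
        if f(x) < f(best):
            best = x.copy()
    return best

# Example: a simple convex quadratic f(x) = ||x||^2, started away from 0.
f = lambda x: float(x @ x)
best = distance_adaptive_dual_averaging(f, lambda x: 2.0 * x,
                                        np.array([5.0, -3.0]))
```

Note that no Lipschitz constant, smoothness level, or target accuracy is supplied, which is the parameter-free behavior the abstract describes; only an arbitrary small initial distance guess is needed.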

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2501.10258
Document Type:
Working Paper