
Exploring Jacobian Inexactness in Second-Order Methods for Variational Inequalities: Lower Bounds, Optimal Algorithms and Quasi-Newton Approximations

Authors:
Agafonov, Artem
Ostroukhov, Petr
Mozhaev, Roman
Yakovlev, Konstantin
Gorbunov, Eduard
Takáč, Martin
Gasnikov, Alexander
Kamzolov, Dmitry
Publication Year:
2024

Abstract

Variational inequalities represent a broad class of problems, including minimization and min-max problems, commonly found in machine learning. Existing second-order and high-order methods for variational inequalities require precise computation of derivatives, often resulting in prohibitively high iteration costs. In this work, we study the impact of Jacobian inaccuracy on second-order methods. For the smooth and monotone case, we establish a lower bound with explicit dependence on the level of Jacobian inaccuracy and propose an optimal algorithm for this key setting. When derivatives are exact, our method converges at the same rate as exact optimal second-order methods. To reduce the cost of solving the auxiliary problem, which arises in all high-order methods with global convergence, we introduce several Quasi-Newton approximations. Our method with Quasi-Newton updates achieves a global sublinear convergence rate. We extend our approach with a tensor generalization for inexact high-order derivatives and support the theory with experiments.
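The abstract does not spell out the specific Quasi-Newton updates used in the paper, but the general idea of replacing an exact Jacobian with a secant-based approximation can be illustrated with a classic Broyden rank-one update. The sketch below is a hypothetical toy example, not the authors' method: it applies Broyden-updated Newton-type steps to a small strongly monotone linear operator arising from a quadratic min-max problem, where the secant condition recovers the true Jacobian after a few iterations.

```python
import numpy as np

def broyden_update(J, s, y):
    """Broyden rank-one update enforcing the secant condition J_new @ s = y."""
    return J + np.outer(y - J @ s, s) / (s @ s)

# Toy monotone operator from the saddle-point problem
# f(x, y) = 0.5*x**2 + x*y - 0.5*y**2, with F(z) = (df/dx, -df/dy):
def F(z):
    x, y = z
    return np.array([x + y, y - x])  # Jacobian [[1, 1], [-1, 1]], strongly monotone

z = np.array([1.0, 1.0])
J = np.eye(2)  # crude initial Jacobian approximation
for _ in range(30):
    step = np.linalg.solve(J, F(z))       # Newton-type step with approximate Jacobian
    z_new = z - step
    s, y_vec = z_new - z, F(z_new) - F(z)
    if s @ s > 1e-16:                     # skip the update once converged
        J = broyden_update(J, s, y_vec)
    z = z_new

print(np.linalg.norm(F(z)))  # small residual: the unique solution is z* = (0, 0)
```

For this linear operator the Broyden iterates recover the exact Jacobian along the visited directions, so the residual vanishes after a handful of steps; the paper's contribution concerns the much harder question of global sublinear rates for such inexact second-order schemes on general smooth monotone problems.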

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2405.15990
Document Type:
Working Paper