
Differentiable Game Mechanics

Authors:
Letcher, Alistair
Balduzzi, David
Racaniere, Sebastien
Martens, James
Foerster, Jakob
Tuyls, Karl
Graepel, Thore
Source:
Journal of Machine Learning Research (JMLR), 20(84):1-40, 2019
Publication Year:
2019

Abstract

Deep learning is built on the foundational guarantee that gradient descent on an objective function converges to local minima. Unfortunately, this guarantee fails in settings, such as generative adversarial nets, that exhibit multiple interacting losses. The behavior of gradient-based methods in games is not well understood -- and is becoming increasingly important as adversarial and multi-objective architectures proliferate. In this paper, we develop new tools to understand and control the dynamics in n-player differentiable games. The key result is to decompose the game Jacobian into two components. The first, symmetric component, is related to potential games, which reduce to gradient descent on an implicit function. The second, antisymmetric component, relates to Hamiltonian games, a new class of games that obey a conservation law akin to conservation laws in classical mechanical systems. The decomposition motivates Symplectic Gradient Adjustment (SGA), a new algorithm for finding stable fixed points in differentiable games. Basic experiments show SGA is competitive with recently proposed algorithms for finding stable fixed points in GANs -- while at the same time being applicable to, and having guarantees in, much more general cases.

Comment: JMLR 2019, journal version of arXiv:1802.05642
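The decomposition described in the abstract can be illustrated on a toy game. The sketch below is not the authors' implementation; it is a minimal NumPy illustration under stated assumptions: a two-player bilinear zero-sum game (a purely Hamiltonian game, where plain simultaneous gradient descent cycles rather than converges), with hand-computed gradients and an SGA-style adjusted update of the form xi + lambda * A^T xi, where A is the antisymmetric part of the game Jacobian. The step size and lambda are illustrative choices, not values from the paper.

```python
import numpy as np

# Hypothetical two-player zero-sum bilinear game:
#   player 1 minimises f1(x, y) = x*y over x,
#   player 2 minimises f2(x, y) = -x*y over y.
# The simultaneous gradient is xi = (df1/dx, df2/dy) = (y, -x).
def xi(w):
    x, y = w
    return np.array([y, -x])

# Game Jacobian J = d(xi)/dw. For this game J is constant and purely
# antisymmetric, so its symmetric (potential) part is zero: a
# "Hamiltonian game" in the paper's terminology.
def jacobian(w):
    return np.array([[0.0, 1.0],
                     [-1.0, 0.0]])

def sga_step(w, lr=0.1, lam=1.0):
    J = jacobian(w)
    A = 0.5 * (J - J.T)              # antisymmetric component of J
    adjusted = xi(w) + lam * A.T @ xi(w)  # symplectic gradient adjustment
    return w - lr * adjusted

w = np.array([1.0, 1.0])
for _ in range(100):
    w = sga_step(w)
print(np.linalg.norm(w))  # small: iterates spiral into the fixed point (0, 0)
```

Using the raw gradient (i.e. dropping the `lam * A.T @ xi(w)` term) on the same game keeps the norm of `w` essentially constant per step while the iterates rotate, which is the cycling behaviour the adjustment is designed to damp.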

Details

Database:
arXiv
Journal:
Journal of Machine Learning Research (JMLR), 20(84):1-40, 2019
Publication Type:
Report
Accession number:
edsarx.1905.04926
Document Type:
Working Paper