
NfgTransformer: Equivariant Representation Learning for Normal-form Games

Authors:
Liu, Siqi
Marris, Luke
Piliouras, Georgios
Gemp, Ian
Heess, Nicolas

Publication Year:
2024

Abstract

Normal-form games (NFGs) are the fundamental model of strategic interaction. We study their representation using neural networks. We describe the inherent equivariance of NFGs -- any permutation of strategies describes an equivalent game -- as well as the challenges this poses for representation learning. We then propose the NfgTransformer architecture that leverages this equivariance, leading to state-of-the-art performance in a range of game-theoretic tasks including equilibrium-solving, deviation gain estimation and ranking, with a common approach to NFG representation. We show that the resulting model is interpretable and versatile, paving the way towards deep learning systems capable of game-theoretic reasoning when interacting with humans and with each other.

Comment: Published at ICLR 2024. Open-sourced at https://github.com/google-deepmind/nfg_transformer
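The equivariance the abstract refers to can be made concrete with a small, self-contained sketch (NumPy, not the paper's open-sourced code): relabelling a player's strategies permutes the payoff tensor, and any per-strategy quantity computed from the game should be permuted in exactly the same way. The payoff tensor shape and the `row_values` function below are illustrative assumptions, not part of the NfgTransformer API.

```python
# Minimal sketch of strategy-permutation equivariance in a normal-form game.
# Assumed setup: a 2-player game stored as G[p, i, j] = payoff to player p
# when player 1 plays strategy i and player 2 plays strategy j.
import numpy as np

rng = np.random.default_rng(0)
A1, A2 = 3, 4                      # number of strategies for players 1 and 2
G = rng.normal(size=(2, A1, A2))   # random payoff tensor

def row_values(game):
    # Example per-strategy quantity: player 1's expected payoff for each of
    # their strategies against a uniformly mixing opponent.
    return game[0].mean(axis=1)

perm = rng.permutation(A1)         # relabel player 1's strategies
G_perm = G[:, perm, :]             # the "same" game with rows permuted

# Equivariance: computing on the relabelled game equals permuting the output.
assert np.allclose(row_values(G_perm), row_values(G)[perm])
```

An equivariant architecture such as the one proposed in the paper is built so that this property holds for its learned per-strategy representations, rather than having to be learned from data.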

Details

Database:
arXiv

Publication Type:
Report

Accession number:
edsarx.2402.08393

Document Type:
Working Paper