
Self-Supervised GAN to Counter Forgetting

Authors:
Chen, Ting
Zhai, Xiaohua
Houlsby, Neil
Publication Year:
2018

Abstract

GANs involve training two networks in an adversarial game, where each network's task depends on its adversary. Recently, several works have framed GAN training as an online or continual learning problem. We focus on the discriminator, which must perform classification under an (adversarially) shifting data distribution. When trained on sequential tasks, neural networks exhibit forgetting. For GANs, discriminator forgetting leads to training instability. To counter forgetting, we encourage the discriminator to maintain useful representations by adding a self-supervised task. Conditional GANs achieve a similar effect using labels. However, our self-supervised GAN does not require labels, and it closes the performance gap between conditional and unconditional models. We show that, in doing so, the self-supervised discriminator learns better representations than regular GANs.

Comment: NeurIPS'18 Continual Learning workshop
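The idea in the abstract can be sketched as a discriminator with two heads: one for the adversarial real/fake decision, and one for an auxiliary self-supervised task whose gradient keeps the shared features useful as the data distribution shifts. The abstract does not specify the task; the sketch below assumes rotation prediction (classifying which of four 90-degree rotations was applied), a common choice in self-supervised GAN work, and uses a toy one-layer discriminator in NumPy. All names and the loss weighting are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def rotate(images, k):
    # Rotate a batch of HxW images by k * 90 degrees.
    return np.rot90(images, k=k, axes=(1, 2))

def discriminator(images, W_feat, W_adv, W_rot):
    # Shared features feed two heads: adversarial (real/fake) and rotation.
    feats = np.maximum(images.reshape(len(images), -1) @ W_feat, 0.0)  # ReLU
    adv_logit = (feats @ W_adv).squeeze(-1)  # shape (B,)
    rot_logits = feats @ W_rot               # shape (B, 4)
    return adv_logit, rot_logits

def self_supervision_loss(images, W_feat, W_adv, W_rot):
    # Auxiliary task (assumed): predict which rotation was applied.
    losses = []
    for k in range(4):
        _, rot_logits = discriminator(rotate(images, k), W_feat, W_adv, W_rot)
        # Cross-entropy against the true rotation label k.
        log_probs = rot_logits - np.log(np.exp(rot_logits).sum(axis=1, keepdims=True))
        losses.append(-log_probs[:, k].mean())
    return float(np.mean(losses))

# Toy setup: batch of 8 random 4x4 "images" and small random weights.
B, H, W = 8, 4, 4
images = rng.normal(size=(B, H, W))
W_feat = rng.normal(size=(H * W, 16)) * 0.1
W_adv = rng.normal(size=(16, 1)) * 0.1
W_rot = rng.normal(size=(16, 4)) * 0.1

adv_logit, _ = discriminator(images, W_feat, W_adv, W_rot)
# Standard adversarial loss on "real" images: softplus(-logit),
# plus the self-supervision term with an illustrative weight of 1.0.
adv_loss = float(np.log1p(np.exp(-adv_logit)).mean())
total_loss = adv_loss + 1.0 * self_supervision_loss(images, W_feat, W_adv, W_rot)
print(total_loss)
```

The key design point the abstract argues for is that, unlike a conditional GAN, the auxiliary labels here (the rotation indices) are generated from the data itself, so no human annotation is required.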

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.1810.11598
Document Type:
Working Paper