
Scalable Balanced Training of Conditional Generative Adversarial Neural Networks on Image Data

Authors :
Pasini, Massimiliano Lupo
Gabbi, Vittorio
Yin, Junqi
Perotto, Simona
Laanait, Nouamane
Source :
Journal of Supercomputing, 2021
Publication Year :
2021

Abstract

We propose a distributed approach to train deep convolutional conditional generative adversarial network (DC-CGAN) models. Our method reduces the imbalance between generator and discriminator by partitioning the training data according to data labels, and enhances scalability by performing a parallel training in which multiple generators are trained concurrently, each focusing on a single data label. Performance is assessed in terms of inception score and image quality on the MNIST, CIFAR10, CIFAR100, and ImageNet1k datasets, showing a significant improvement over state-of-the-art techniques for training DC-CGANs. Weak scaling is attained on all four datasets using up to 1,000 processes and 2,000 NVIDIA V100 GPUs on the OLCF supercomputer Summit.
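To make the label-based partitioning concrete, the sketch below groups a dataset's sample indices by class so that each worker (one per label) can train its own generator on a single class. This is a minimal illustration assuming PyTorch; the per-label training routine and the rank-to-label assignment are hypothetical placeholders and not the authors' implementation.

```python
# Minimal sketch (not the authors' code): partition training data by class label
# so that each process trains its own generator on images of a single label.
import torch
from torch.utils.data import Subset, DataLoader
from torchvision import datasets, transforms

def partition_by_label(dataset):
    """Group sample indices by class label, returning one Subset per label."""
    indices_per_label = {}
    for idx, (_, label) in enumerate(dataset):
        indices_per_label.setdefault(int(label), []).append(idx)
    return {label: Subset(dataset, idxs) for label, idxs in indices_per_label.items()}

def main():
    dataset = datasets.MNIST("data", train=True, download=True,
                             transform=transforms.ToTensor())
    partitions = partition_by_label(dataset)

    # In the distributed setting, each process/GPU would be assigned one label;
    # `rank` here is a stand-in for e.g. torch.distributed.get_rank().
    rank = 0
    loader = DataLoader(partitions[rank], batch_size=128, shuffle=True)
    # train_generator_for_label(rank, loader)  # hypothetical per-label GAN training loop

if __name__ == "__main__":
    main()
```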

Details

Database :
arXiv
Journal :
Journal of Supercomputing, 2021
Publication Type :
Report
Accession number :
edsarx.2102.10485
Document Type :
Working Paper
Full Text :
https://doi.org/10.1007/s11227-021-03808-2