BSD-GAN: Branched Generative Adversarial Network for Scale-Disentangled Representation Learning and Image Synthesis.
- Source :
- IEEE Transactions on Image Processing. 2020, Vol. 29, p9073-9083. 11p.
- Publication Year :
- 2020
Abstract
- We introduce BSD-GAN, a novel multi-branch, scale-disentangled training method that enables unconditional Generative Adversarial Networks (GANs) to learn image representations at multiple scales, benefiting a wide range of generation and editing tasks. The key feature of BSD-GAN is that it is trained in multiple branches, progressively covering both the breadth and depth of the network as the resolution of the training images increases to reveal finer-scale features. Specifically, each noise vector, as input to the generator network of BSD-GAN, is deliberately split into several sub-vectors, each corresponding to, and trained to learn, image representations at a particular scale. During training, we progressively “de-freeze” the sub-vectors, one at a time, as a new set of higher-resolution images is employed for training and more network layers are added. A consequence of this explicit sub-vector designation is that we can directly manipulate, and even combine, latent (sub-vector) codes that model different feature scales. Extensive experiments demonstrate the effectiveness of our training method in scale-disentangled learning of image representations and in synthesizing novel image content, without any extra labels and without compromising the quality of the synthesized high-resolution images. We further demonstrate several image generation and manipulation applications enabled or improved by BSD-GAN. [ABSTRACT FROM AUTHOR]
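To make the latent design in the abstract concrete, below is a minimal PyTorch sketch of its two core ideas: splitting the noise vector into per-scale sub-vectors, and keeping not-yet-activated sub-vectors frozen during progressive training. All names and sizes here (split_latent, mask_frozen, NUM_SCALES, SUB_DIM) are illustrative assumptions, not the authors' implementation.

    import torch

    # Hypothetical sizes; the paper's actual dimensions may differ.
    NUM_SCALES = 4    # number of feature scales / generator branches
    SUB_DIM = 32      # dimensionality of each per-scale sub-vector

    def split_latent(z):
        # Split the full noise vector into one sub-vector per scale.
        return list(torch.split(z, SUB_DIM, dim=1))

    def mask_frozen(sub_vectors, active_scales):
        # Zero out sub-vectors for scales that have not been
        # "de-frozen" yet, so only the coarse, already-active
        # scales carry signal into the generator.
        return [v if i < active_scales else torch.zeros_like(v)
                for i, v in enumerate(sub_vectors)]

    # Stage 2 of progressive training: the two coarsest sub-vectors
    # are active; finer-scale sub-vectors remain frozen at zero.
    z = torch.randn(8, NUM_SCALES * SUB_DIM)  # batch of 8 noise vectors
    subs = mask_frozen(split_latent(z), active_scales=2)

    # Scale-level editing: combine the coarse code of one sample
    # with the finer codes of another before feeding the generator.
    z_a = torch.randn(1, NUM_SCALES * SUB_DIM)
    z_b = torch.randn(1, NUM_SCALES * SUB_DIM)
    mixed = torch.cat(split_latent(z_a)[:1] + split_latent(z_b)[1:], dim=1)

The per-scale sub-vector designation is what allows the recombination in the last step: because each sub-vector is trained against a particular resolution stage, swapping one sub-vector between latent codes edits the image at that scale while leaving the others intact.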
- Subjects :
- *GENERATIVE adversarial networks
*IMAGE representation
Details
- Language :
- English
- ISSN :
- 1057-7149
- Volume :
- 29
- Database :
- Academic Search Index
- Journal :
- IEEE Transactions on Image Processing
- Publication Type :
- Academic Journal
- Accession Number :
- 170078571
- Full Text :
- https://doi.org/10.1109/TIP.2020.3014608