AugGAN: Cross Domain Adaptation with GAN-Based Data Augmentation
- Source :
- Computer Vision – ECCV 2018, ECCV (9)
- Publication Year :
- 2018
- Publisher :
- Springer International Publishing
Abstract
- Deep-learning-based image-to-image translation methods aim to learn the joint distribution of two domains and to find transformations between them. Although recent GAN (Generative Adversarial Network) based methods have shown compelling results, they are prone to failing to preserve image-objects and to maintain translation consistency, which limits their practicality on tasks such as generating large-scale training data for different domains. To address this problem, we propose a structure-aware image-to-image translation network, which is composed of encoders, generators, discriminators, and parsing nets for the two domains, respectively, in a unified framework. The proposed network generates more visually plausible images than competing methods on different image-translation tasks. In addition, we quantitatively evaluate the different methods by training Faster R-CNN and YOLO on datasets generated from the image-translation results, and demonstrate significant improvements in detection accuracy when using the proposed image-object-preserving network.
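- The abstract describes a framework built from per-domain encoders, generators, discriminators, and parsing (segmentation) nets. The composition can be sketched as below; this is a minimal toy illustration with NumPy, not the authors' implementation, and all class names, layer choices, and dimensions are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def dense(x, w):
    """Toy fully connected layer standing in for a conv stack."""
    return np.tanh(x @ w)

class DomainBranch:
    """One domain's components in an AugGAN-style framework (assumed layout):
    encoder -> latent code, generator -> image, parsing net -> segmentation.
    A discriminator per domain would score generated images during training."""
    def __init__(self, dim_img, dim_lat, n_classes):
        self.enc = rng.normal(size=(dim_img, dim_lat)) * 0.1
        self.gen = rng.normal(size=(dim_lat, dim_img)) * 0.1
        self.parse = rng.normal(size=(dim_lat, n_classes)) * 0.1

    def encode(self, x):
        return dense(x, self.enc)

    def decode(self, z):
        return dense(z, self.gen)

    def segment(self, z):
        return dense(z, self.parse)

def translate(src, dst, x):
    """Translate x from the source to the target domain; also predict a
    segmentation map from the shared code, the structure-aware signal that
    encourages the translation to preserve image-objects."""
    z = src.encode(x)
    return dst.decode(z), src.segment(z)

# Hypothetical day-to-night translation with flattened 64-dim "images".
day, night = DomainBranch(64, 16, 5), DomainBranch(64, 16, 5)
x_day = rng.normal(size=(2, 64))
x_night, seg = translate(day, night, x_day)
```

In training, the segmentation output would be supervised alongside the adversarial and consistency losses, so the latent code must retain object structure rather than only domain appearance.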
- Subjects :
- Domain adaptation
Parsing
Computer science
Deep learning
Machine learning
Translation (geometry)
Object detection
Consistency (database systems)
Joint probability distribution
Artificial intelligence
Encoder
Details
- ISBN :
- 978-3-030-01239-7
- ISBNs :
- 9783030012397
- Database :
- OpenAIRE
- Journal :
- Computer Vision – ECCV 2018 ISBN: 9783030012397, ECCV (9)
- Accession number :
- edsair.doi...........e0bf5fb518218029b5bd1135b046d6e8
- Full Text :
- https://doi.org/10.1007/978-3-030-01240-3_44