Knowledge Distillation for Face Photo–Sketch Synthesis.
- Source :
- IEEE Transactions on Neural Networks & Learning Systems, Feb. 2022, Vol. 33, Issue 2, pp. 893-906 (14 pages)
- Publication Year :
- 2022
Abstract
- Significant progress has been made in face photo–sketch synthesis in recent years due to the development of deep convolutional neural networks, particularly generative adversarial networks (GANs). However, the performance of existing methods is still limited by the lack of training data (photo–sketch pairs). To address this challenge, we investigate the effect of knowledge distillation (KD) on training neural networks for the face photo–sketch synthesis task and propose an effective KD model to improve the quality of synthesized images. In particular, we utilize a teacher network trained on a large amount of data from a related task to separately learn knowledge of the face photo and knowledge of the face sketch, and simultaneously transfer this knowledge to two student networks designed for the face photo–sketch synthesis task. In addition to assimilating the knowledge from the teacher network, the two student networks mutually transfer their own knowledge to further enhance their learning. To further improve the perceptual quality of the synthesized images, we propose a KD+ model that combines GANs with KD; under the guidance of the distilled knowledge, the generator produces images with more realistic textures and less noise. Extensive experiments and a user study demonstrate the superiority of our models over state-of-the-art methods. [ABSTRACT FROM AUTHOR]
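The abstract describes a loss that combines three signals per student: the task objective, distillation from the teacher, and mutual distillation from the peer student. A minimal sketch of such a combined objective, using plain NumPy and MSE on feature maps, is shown below; the weights `alpha` and `beta` and the function names are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def mse(a, b):
    """Mean squared error between two feature arrays."""
    return float(np.mean((a - b) ** 2))

def kd_loss(student_feat, teacher_feat, peer_feat, target,
            alpha=0.5, beta=0.25):
    """Hypothetical combined student objective (weights are illustrative):
    task reconstruction loss
    + alpha * distillation from the teacher network
    + beta  * mutual distillation from the peer student network."""
    return (mse(student_feat, target)
            + alpha * mse(student_feat, teacher_feat)
            + beta * mse(student_feat, peer_feat))
```

In this sketch each of the two students (photo-side and sketch-side) would call `kd_loss` with the other student as `peer_feat`, so the mutual-transfer term is symmetric across the pair.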
Details
- Language :
- English
- ISSN :
- 2162-237X
- Volume :
- 33
- Issue :
- 2
- Database :
- Complementary Index
- Journal :
- IEEE Transactions on Neural Networks & Learning Systems
- Publication Type :
- Periodical
- Accession number :
- 155108522
- Full Text :
- https://doi.org/10.1109/TNNLS.2020.3030536