1. Dual-Discriminator Generative Adversarial Network with Uniform Color Information Extraction for Color Constancy.
- Author
Huiting Xu, Zhenshan Tan, Zhijiang Li, and Shuying Lyu
- Subjects
GENERATIVE adversarial networks, DATA mining, STRUCTURAL colors, FEATURE extraction
- Abstract
Generative adversarial networks (GANs) have attracted extensive attention in color constancy because they allow pixel-wise supervision. However, the misinterpretation of color features and the low discriminator sensitivity caused by the strong correlation among multiple features limit the learning capability of GANs. To address these issues, we propose a dual-discriminator generative adversarial network (DDGAN), which includes a color feature learning (CFL) module, a feature fusion discriminator (FFD) module, and a global consistency constraint (GCC) module. First, CFL attends to regions with uniform color so that the generator learns distinguishable color information. Second, FFD is a discriminator module with two feature extraction branches: one extracts color features and the other extracts globally correlated features. These features are then fused to weaken structural features and enhance the discriminator's sensitivity to color features. Finally, GCC imposes global consistency constraints to reintroduce the structural features weakened by FFD and to unify structural and color features, aiming to produce images with more consistent colors and content. Extensive experiments on the ColorChecker RECommended, NUS 8-Camera, and Cube datasets show that our DDGAN outperforms other GAN-based methods on five popular metrics. [ABSTRACT FROM AUTHOR]
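The fusion idea behind FFD can be illustrated with a minimal numpy sketch: two stand-in branches (simple color statistics for the color branch, Gram-style channel correlations for the globally correlated branch) are concatenated before a linear scoring head. All function names, feature choices, and shapes here are hypothetical toys for illustration, not the paper's learned networks.

```python
import numpy as np

def color_branch(img):
    # Stand-in for learned color features: per-channel mean and std.
    return np.concatenate([img.mean(axis=(0, 1)), img.std(axis=(0, 1))])

def global_branch(img):
    # Stand-in for globally correlated features: channel Gram matrix.
    flat = img.reshape(-1, img.shape[-1])   # (H*W, C)
    gram = flat.T @ flat / flat.shape[0]    # (C, C) channel correlations
    return gram.flatten()

def ffd_score(img, w):
    # Fuse both branches, then apply a linear scorer as a toy
    # discriminator head (the paper uses learned networks instead).
    fused = np.concatenate([color_branch(img), global_branch(img)])
    return float(fused @ w)

rng = np.random.default_rng(0)
img = rng.random((8, 8, 3))               # toy RGB patch in [0, 1)
w = rng.standard_normal(6 + 9)            # 6 color stats + 3x3 Gram entries
print(ffd_score(img, w))
```

In the actual DDGAN both branches are learned, and the fusion is designed to dilute structural cues so the discriminator stays sensitive to color; the GCC module then restores global structural consistency on the generator side.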
- Published
- 2024