DPBA-WGAN: A Vector-Valued Differential Private Bilateral Alternative Scheme on WGAN for Image Generation
- Authors
Danhua Wu, Wenyong Zhang, and Panfeng Zhang
- Subjects
Data generation, deep learning model, differential privacy, noisy perturbation, WGAN, Electrical engineering. Electronics. Nuclear engineering, TK1-9971
- Abstract
The large amount of sensitive personal information used in deep learning models has attracted considerable attention to privacy security. Sensitive data may be memorized by, or encoded into, the parameters or generated outputs of Wasserstein Generative Adversarial Networks (WGANs); this can be prevented by applying privacy-preserving algorithms during parameter training. At the same time, the model is still expected to produce useful generated results. We propose a vector-valued differential private bilateral alternative (DPBA) algorithm, a novel perturbation method for the training process. Vector-valued Gaussian (VVG) noise incorporating functional structure information is injected into the WGAN to generate data with privacy protection, and the model is verified to satisfy differential privacy. The bilateral alternative noise randomly perturbs the gradients while still yielding informative, feature-rich samples, and the dynamic, vector-based perturbation strengthens the privacy guarantee. In extensive evaluation, our algorithm outperformed state-of-the-art techniques on usability metrics across all validation datasets. The downstream classification accuracy on generated MNIST data was 97.04%, and 80.91% on Fashion-MNIST. On MNIST, our method improved the average accuracy of the neural network classifier by at least 16.81%, and on Fashion-MNIST by at least 3.55%. In the multichannel generation tasks, binary classification accuracy on CelebA improved by at least 10.4%, and accuracy on the Street View House Numbers (SVHN) dataset reached 86.1%. The perturbation method also proved highly resilient to data recovery under simulated gradient attacks.
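The abstract describes injecting Gaussian noise into gradients during WGAN training so that the trained model satisfies differential privacy. A minimal sketch of the underlying clip-and-noise mechanism common to such DP training schemes is shown below; it is a generic illustration, not the paper's exact DPBA/VVG algorithm, and the function name, clipping bound, and noise multiplier are assumptions.

```python
import numpy as np

def dp_perturb_gradient(grad, clip_norm=1.0, noise_multiplier=1.1, rng=None):
    """Clip a per-example gradient and add Gaussian noise.

    Illustrative sketch of the standard DP clip-and-noise step,
    not the paper's exact DPBA/VVG scheme (assumed names/defaults).
    """
    rng = np.random.default_rng() if rng is None else rng
    grad = np.asarray(grad, dtype=float)
    # Scale the gradient down so its L2 norm is at most clip_norm.
    norm = np.linalg.norm(grad)
    clipped = grad / max(1.0, norm / clip_norm)
    # Add isotropic Gaussian noise calibrated to the clipping bound.
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=grad.shape)
    return clipped + noise
```

In the paper's scheme the noise is vector-valued (carrying structural information) and is alternated bilaterally between the generator and discriminator updates; the isotropic noise here only illustrates the basic privacy mechanism.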
- Published
- 2023