Privacy Amplification of Iterative Algorithms via Contraction Coefficients
- Publication Year :
- 2020
Abstract
- We investigate the framework of privacy amplification by iteration, recently proposed by Feldman et al., through an information-theoretic lens. We demonstrate that differential privacy guarantees of iterative mappings can be determined by a direct application of contraction coefficients derived from strong data processing inequalities for $f$-divergences. In particular, by generalizing Dobrushin's contraction coefficient for total variation distance to an $f$-divergence known as the $E_{\gamma}$-divergence, we derive tighter bounds on the differential privacy parameters of the projected noisy stochastic gradient descent algorithm with hidden intermediate updates.
- Comment : Submitted for publication
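
For orientation, the following standard identities (in notation chosen here, which need not match the paper's) connect the $E_{\gamma}$-divergence to differential privacy and to Dobrushin-type contraction coefficients. For $\gamma \ge 1$ and distributions $P, Q$,
$$E_{\gamma}(P \,\|\, Q) = \sup_{A} \big( P(A) - \gamma\, Q(A) \big),$$
so $E_{1}$ is the total variation distance, and a randomized mechanism $M$ is $(\varepsilon, \delta)$-differentially private if and only if $E_{e^{\varepsilon}}\big(M(x) \,\|\, M(x')\big) \le \delta$ for all neighboring datasets $x, x'$. The associated contraction coefficient of a Markov kernel $K$ is
$$\eta_{\gamma}(K) = \sup_{P \ne Q} \frac{E_{\gamma}(P K \,\|\, Q K)}{E_{\gamma}(P \,\|\, Q)},$$
which reduces to Dobrushin's coefficient at $\gamma = 1$; the strong data processing inequality $E_{\gamma}(P K \,\|\, Q K) \le \eta_{\gamma}(K)\, E_{\gamma}(P \,\|\, Q)$ is what lets each noisy iteration tighten the privacy bound.

The algorithm the abstract refers to is projected noisy stochastic gradient descent with hidden intermediate updates. A minimal sketch of that setting, assuming caller-supplied helpers `grad` and `project` (all names and parameter values here are illustrative, not taken from the paper), might look like:

```python
import numpy as np

def projected_noisy_sgd(grad, project, x0, data, lr=0.1, sigma=1.0, epochs=1, rng=None):
    """Sketch of projected noisy SGD in which only the final iterate is
    released, so intermediate updates remain hidden. `grad(x, z)` is a
    per-example gradient and `project(x)` maps back onto the constraint
    set; both are assumed caller-supplied stand-ins."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    for _ in range(epochs):
        for z in data:
            noise = sigma * rng.standard_normal(x.shape)   # Gaussian perturbation per step
            x = project(x - lr * (grad(x, z) + noise))     # noisy gradient step, then projection
    return x  # only the last iterate is published

# Illustrative use: noisy mean estimation constrained to the unit ball.
data = [np.array([0.5, 0.2]), np.array([0.1, 0.9])]
project_ball = lambda x: x / max(1.0, float(np.linalg.norm(x)))
x_final = projected_noisy_sgd(lambda x, z: x - z, project_ball, np.zeros(2), data, lr=0.2, sigma=0.5)
```

Because only the final iterate is returned, the intermediate states play the role of the hidden updates whose release the contraction-coefficient analysis avoids accounting for.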
Details
- Database :
- arXiv
- Publication Type :
- Report
- Accession number :
- edsarx.2001.06546
- Document Type :
- Working Paper