Why current differential privacy schemes are inapplicable for correlated data publishing?
- Source :
- World Wide Web. 2021, Vol. 24, Issue 1, p1-23. 23p.
- Publication Year :
- 2021
Abstract
- Although data analysis and mining technologies can efficiently provide intelligent and personalized services, data owners may not always be willing to share their true data because of privacy concerns. Recently, differential privacy (DP) technology has achieved a good trade-off between data utility and privacy guarantees by publishing noisy outputs. Nonetheless, DP still carries a risk of privacy leakage when applied directly to correlated data. Current schemes attempt to extend DP to publish correlated data, but they either violate DP or suffer from low data utility. In this paper, we explore the essential cause of this inapplicability. Specifically, we suppose that it arises from the different correlation structures of the noise and the original data. To verify this supposition, we propose the notion of the Correlation-Distinguishability Attack (CDA), which separates IID (Independent and Identically Distributed) noise from correlated data. Furthermore, taking time series as an example, we design an optimum filter to realize CDA in practical applications. Experimental results support our supposition and show that the privacy degree of current approaches degrades under CDA. [ABSTRACT FROM AUTHOR]
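The abstract does not detail the paper's optimum filter; the following is a minimal illustrative sketch of the underlying idea only, assuming a smooth (correlated) synthetic time series, IID Laplace noise as the DP mechanism, and a simple moving-average filter standing in for the paper's filter. It shows that because the correlated signal is low-frequency while IID noise is broadband, even naive filtering recovers an estimate closer to the original than the published noisy release:

```python
import numpy as np

rng = np.random.default_rng(0)

# A strongly correlated (smooth) time series: the "original data".
t = np.linspace(0, 4 * np.pi, 500)
original = np.sin(t)

# DP-style release: add IID Laplace noise (scale chosen purely for illustration).
noisy = original + rng.laplace(scale=0.5, size=original.size)

# Attack sketch: the signal varies slowly, the IID noise does not,
# so a simple moving-average filter strips out much of the noise.
window = 25
kernel = np.ones(window) / window
filtered = np.convolve(noisy, kernel, mode="same")

err_noisy = np.mean((noisy - original) ** 2)      # error of the published release
err_filtered = np.mean((filtered - original) ** 2)  # error after filtering
print(err_filtered < err_noisy)  # → True: filtering gets closer to the original
```

The gap between the two errors is the intuition behind CDA: the effective privacy of the release is weaker than the nominal noise level suggests, because an adversary can exploit the mismatch in correlation between noise and data.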
- Subjects :
- PRIVACY
  DATA mining
  TIME series analysis
  DATA analysis
  HYPOTHESIS
Details
- Language :
- English
- ISSN :
- 1386-145X
- Volume :
- 24
- Issue :
- 1
- Database :
- Academic Search Index
- Journal :
- World Wide Web
- Publication Type :
- Academic Journal
- Accession number :
- 148320520
- Full Text :
- https://doi.org/10.1007/s11280-020-00825-8