
Deciding Differential Privacy for Programs with Finite Inputs and Outputs

Authors:
Barthe, Gilles
Chadha, Rohit
Jagannath, Vishal
Sistla, A. Prasad
Viswanathan, Mahesh
Publication Year: 2019

Abstract

Differential privacy is a de facto standard for statistical computations over databases that contain private data. The strength of differential privacy lies in a rigorous mathematical definition that guarantees individual privacy and yet allows for accurate statistical results. Thanks to its mathematical definition, differential privacy is also a natural target for formal analysis. A broad line of work uses logical methods for proving privacy. However, these methods are not complete, and only partially automated. A recent and complementary line of work uses statistical methods for finding privacy violations. However, the methods only provide statistical guarantees (but no proofs). We propose the first decision procedure for checking the differential privacy of a non-trivial class of probabilistic computations. Our procedure takes as input a program P parametrized by a privacy budget $\epsilon$, and either proves differential privacy for all possible values of $\epsilon$ or generates a counterexample. In addition, our procedure applies both to $\epsilon$-differential privacy and $(\epsilon,\delta)$-differential privacy. Technically, the decision procedure is based on a novel and judicious encoding of the semantics of programs in our class into a decidable fragment of the first-order theory of the reals with exponentiation. We implement our procedure and use it for (dis)proving privacy bounds for many well-known examples, including randomized response, histogram, report noisy max and sparse vector.
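For reference, the standard definition the abstract relies on (not stated in this record itself) is: a randomized program $M$ is $(\epsilon,\delta)$-differentially private if for every pair of adjacent databases $D, D'$ and every set $S$ of outputs,

$$\Pr[M(D) \in S] \le e^{\epsilon}\,\Pr[M(D') \in S] + \delta,$$

and $\epsilon$-differential privacy is the special case $\delta = 0$. As a concrete instance of one of the named examples, in a common formulation of randomized response a participant reports their true bit with probability $e^{\epsilon}/(1+e^{\epsilon})$ and the opposite bit otherwise; the ratio of output probabilities across adjacent inputs is then exactly $e^{\epsilon}$, so that mechanism is $\epsilon$-differentially private.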

Details

Database: arXiv
Publication Type: Report
Accession number: edsarx.1910.04137
Document Type: Working Paper
Full Text: https://doi.org/10.1145/3373718.339479