Hypothesis Testing Interpretations and Renyi Differential Privacy


Borja Balle, Gilles Barthe, Marco Gaboardi, Justin Hsu, Tetsuya Sato;
Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, PMLR 108:2496-2506, 2020.


Differential privacy is a de facto standard in data privacy, with applications in the public and private sectors. One way of explaining differential privacy, which is particularly appealing to statisticians and social scientists, is by means of its statistical hypothesis testing interpretation. Informally, one cannot effectively test whether a specific individual has contributed her data by observing the output of a private mechanism: no test can have both high significance and high power. In this paper, we identify conditions under which a privacy definition given in terms of a statistical divergence satisfies a similar interpretation. These conditions are useful for analyzing the distinguishing power of divergences, and we use them to study the hypothesis testing interpretation of some relaxations of differential privacy based on Rényi divergence. Our analysis also yields an improved conversion rule between these definitions and differential privacy.
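For context on the conversion rule mentioned above: the hypothesis testing interpretation of (ε, δ)-differential privacy says that any test for an individual's presence with type I error a and type II error b must satisfy a + e^ε b ≥ 1 − δ and b + e^ε a ≥ 1 − δ. The sketch below illustrates the standard (pre-existing) conversion from Rényi differential privacy to (ε, δ)-DP due to Mironov (2017), which this paper improves upon; the Gaussian-mechanism RDP curve used here is the textbook one, and the grid search over the Rényi order α is an illustrative choice, not part of the paper's method.

```python
import math

def rdp_to_dp(alpha, eps_rdp, delta):
    """Standard conversion (Mironov, 2017): if a mechanism satisfies
    (alpha, eps_rdp)-RDP, then for any delta > 0 it satisfies
    (eps, delta)-DP with eps = eps_rdp + log(1/delta) / (alpha - 1).
    The paper above derives a tighter rule; this is the baseline."""
    return eps_rdp + math.log(1.0 / delta) / (alpha - 1.0)

# Illustration: the Gaussian mechanism with noise scale sigma and
# sensitivity 1 satisfies (alpha, alpha / (2 * sigma**2))-RDP for all
# alpha > 1. Optimize the conversion over a grid of alpha values.
sigma = 2.0
delta = 1e-5
best_eps = min(
    rdp_to_dp(a, a / (2 * sigma**2), delta)
    for a in (1.0 + k / 10.0 for k in range(1, 1000))
)
print(round(best_eps, 3))  # -> 2.524
```

Note how the bound trades off the two terms: large α shrinks the log(1/δ)/(α − 1) penalty but inflates the RDP term, so the best guarantee comes from optimizing over the order α.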
