Pain-Free Random Differential Privacy with Sensitivity Sampling
Proceedings of the 34th International Conference on Machine Learning, PMLR 70:2950-2959, 2017.
Abstract
Popular approaches to differential privacy, such as the Laplace and exponential mechanisms, calibrate randomised smoothing through global sensitivity of the target non-private function. Bounding such sensitivity is often a prohibitively complex analytic calculation. As an alternative, we propose a straightforward sampler for estimating sensitivity of non-private mechanisms. Since our sensitivity estimates hold with high probability, any mechanism that would be $(\epsilon,\delta)$-differentially private under bounded global sensitivity automatically achieves $(\epsilon,\delta,\gamma)$-random differential privacy (Hall et al. 2012), without any target-specific calculations required. We demonstrate on worked example learners how our usable approach adopts a naturally-relaxed privacy guarantee, while achieving more accurate releases even for non-private functions that are black-box computer programs.
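The abstract's idea can be illustrated with a minimal sketch: estimate the sensitivity of a black-box function by sampling neighbouring dataset pairs, then calibrate Laplace noise to that estimate. This is not the paper's algorithm (which uses an order statistic of sampled sensitivities to obtain the high-probability guarantee); the sampler below simply takes the maximum over `m` sampled pairs, and the names `estimate_sensitivity`, `laplace_release`, and the choice of `m` are assumptions for illustration only.

```python
import numpy as np

def estimate_sensitivity(f, sample_db, m=200, rng=None):
    """Crude sensitivity estimate for a black-box f.

    Draws m datasets D from sample_db, forms a neighbour D' by replacing
    one record with a fresh draw, and returns the largest observed
    |f(D) - f(D')|. (The paper instead takes an order statistic to get a
    probabilistic guarantee; the max here is a simplification.)
    """
    rng = rng if rng is not None else np.random.default_rng(0)
    worst = 0.0
    for _ in range(m):
        D = sample_db(rng)
        Dp = D.copy()
        i = rng.integers(len(Dp))
        Dp[i] = sample_db(rng)[0]  # swap one record for a fresh sample
        worst = max(worst, abs(f(D) - f(Dp)))
    return worst

def laplace_release(f, D, sens, eps, rng=None):
    """Release f(D) with Laplace noise scaled to the estimated sensitivity."""
    rng = rng if rng is not None else np.random.default_rng(1)
    return f(D) + rng.laplace(scale=sens / eps)

# Example: the mean of n records in [0, 1]; its true global sensitivity is 1/n,
# so the sampled estimate should never exceed that bound.
n = 1000
sample_db = lambda rng: rng.uniform(0.0, 1.0, size=n)
sens = estimate_sensitivity(np.mean, sample_db, m=200)
D = np.random.default_rng(42).uniform(0.0, 1.0, size=n)
release = laplace_release(np.mean, D, sens, eps=1.0)
```

Because the estimate only holds with high probability over the sampled pairs, the resulting guarantee is the relaxed random differential privacy of the abstract rather than worst-case $(\epsilon,\delta)$-DP.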