Renyi Differentially Private ERM for Smooth Objectives
Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics, PMLR 89:2037-2046, 2019.
Abstract
In this paper, we present a Renyi differentially private stochastic gradient descent (SGD) algorithm for convex empirical risk minimization. The algorithm uses output perturbation and leverages the randomness inside SGD, which creates a "randomized sensitivity," to reduce the amount of noise that must be added. One benefit of output perturbation is that we can incorporate a periodic averaging step that further reduces sensitivity while improving accuracy (damping the well-known oscillating behavior of SGD near the optimum). Renyi differential privacy guarantees can be converted into (ε, δ)-differential privacy guarantees, which enables a comparison with prior work. An empirical evaluation demonstrates that the proposed method outperforms prior methods for differentially private ERM.
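As a rough illustration of the approach sketched in the abstract, the following Python snippet runs SGD on a smooth convex loss (logistic regression, as one example), folds in a periodic iterate-averaging step, and releases the model with Gaussian output perturbation. The loss function, the averaging schedule, and the `noise_scale` parameter are illustrative assumptions introduced here; the paper calibrates the noise through its randomized-sensitivity analysis, which this sketch does not reproduce.

```python
import numpy as np

def dp_sgd_output_perturbation(X, y, epochs=5, lr=0.1,
                               avg_period=50, noise_scale=0.1,
                               rng=None):
    """Sketch: SGD on a smooth convex (logistic) loss with periodic
    iterate averaging, released via Gaussian output perturbation.
    `noise_scale` is a stand-in for the noise level the paper derives
    from its randomized-sensitivity analysis (hypothetical here)."""
    rng = np.random.default_rng() if rng is None else rng
    n, d = X.shape
    w = np.zeros(d)
    w_sum, count = np.zeros(d), 0
    for _ in range(epochs):
        for i in rng.permutation(n):  # SGD's internal randomness
            margin = y[i] * (X[i] @ w)
            grad = -y[i] * X[i] / (1.0 + np.exp(margin))  # logistic-loss gradient
            w -= lr * grad
            w_sum += w
            count += 1
            if count % avg_period == 0:  # periodic averaging step
                w = w_sum / count        # reset iterate to the running average
    # Output perturbation: add Gaussian noise to the final model
    return w + rng.normal(scale=noise_scale, size=d)

# Toy usage on synthetic data with labels in {-1, +1}
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = np.sign(X @ rng.normal(size=5))
y[y == 0] = 1.0
w_private = dp_sgd_output_perturbation(X, y, rng=rng)
```

This is only a schematic: in the actual algorithm, the noise magnitude would be calibrated so that releasing the perturbed model satisfies the stated Renyi differential privacy guarantee, rather than being a fixed constant.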