Old Techniques in Differentially Private Linear Regression
Proceedings of the 30th International Conference on Algorithmic Learning Theory, PMLR 98:789-827, 2019.
Abstract
We introduce three novel differentially private algorithms that approximate the $2^{\rm nd}$-moment matrix of the data. These algorithms, which in contrast to existing algorithms always output positive-definite matrices, correspond to existing techniques in the linear regression literature. Thus these techniques have an immediate interpretation, and all results known about these techniques are straightforwardly applicable to the outputs of these algorithms. More specifically, we discuss the following three techniques. (i) For Ridge Regression, we propose setting the regularization coefficient so that by approximating the solution using the Johnson-Lindenstrauss transform we preserve privacy. (ii) We show that adding a batch of $d+O(\epsilon^{-2})$ random samples to our data preserves differential privacy. (iii) We show that sampling the $2^{\rm nd}$-moment matrix from a Bayesian posterior inverse-Wishart distribution is differentially private. We also give utility bounds for our algorithms and compare them with the existing “Analyze Gauss” algorithm of Dwork et al.
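To make technique (iii) concrete, the following is a minimal sketch of drawing a positive-definite approximation of the $2^{\rm nd}$-moment matrix from an inverse-Wishart posterior. The prior parameters (`nu0`, `Psi0`) below are illustrative placeholders: the paper's contribution is calibrating such parameters so that the posterior sample satisfies differential privacy, and that calibration is not reproduced here.

```python
import numpy as np
from scipy.stats import invwishart

rng = np.random.default_rng(0)
n, d = 200, 3
X = rng.normal(size=(n, d))        # toy data; rows are samples

A = X.T @ X                        # empirical 2nd-moment matrix

# Assumed (uncalibrated) inverse-Wishart prior hyperparameters.
nu0 = d + 2                        # prior degrees of freedom
Psi0 = np.eye(d)                   # prior scale matrix

# Under a zero-mean Gaussian likelihood with an inverse-Wishart prior,
# the posterior over the covariance is IW(nu0 + n, Psi0 + A); sampling
# from it yields a symmetric positive-definite matrix by construction.
Sigma_sample = invwishart.rvs(df=nu0 + n, scale=Psi0 + A, random_state=0)

# Check positive-definiteness via the eigenvalues.
print(np.linalg.eigvalsh(Sigma_sample).min() > 0)
```

Unlike additive-noise approaches such as Analyze Gauss, which can produce indefinite matrices that need a projection step, a draw from this posterior is always usable directly as a covariance estimate.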