Differentially Private Regression with Gaussian Processes

Michael T. Smith, Mauricio A. Álvarez, Max Zwiessele, Neil D. Lawrence
Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics, PMLR 84:1195-1203, 2018.

Abstract

A major challenge for machine learning is increasing the availability of data while respecting the privacy of individuals. Here we combine the provable privacy guarantees of the differential privacy framework with the flexibility of Gaussian processes (GPs). We propose a method using GPs to provide differentially private (DP) regression. We then improve this method by crafting the DP noise covariance structure to efficiently protect the training data, while minimising the scale of the added noise. We find that this cloaking method achieves the greatest accuracy, while still providing privacy guarantees, and offers practical DP for regression over multi-dimensional inputs. Together these methods provide a starter toolkit for combining differential privacy and GPs.
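To make the idea above concrete, here is a minimal illustrative sketch (not the paper's exact algorithm) of differentially private GP regression: the standard GP posterior mean is computed, then correlated Gaussian noise drawn from the prior covariance at the test points is added via the Gaussian mechanism, in the spirit of the baseline the paper builds on. The paper's cloaking method instead shapes this noise covariance around each training point's influence to add far less noise; all function and parameter names below are assumptions for this sketch.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    """Squared-exponential (RBF) kernel matrix between rows of A and B."""
    d2 = (np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :]
          - 2.0 * A @ B.T)
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def dp_gp_predict(X, y, Xstar, eps=1.0, delta=1e-2, sens=1.0,
                  noise_var=0.1, rng=None):
    """GP posterior mean at Xstar plus structured DP noise.

    sens is an assumed bound (Delta) on how much one individual's
    output can change; eps and delta are the DP parameters.
    """
    rng = np.random.default_rng(rng)
    K = rbf_kernel(X, X) + noise_var * np.eye(len(X))
    Ks = rbf_kernel(Xstar, X)
    mean = Ks @ np.linalg.solve(K, y)            # standard GP mean
    # Gaussian-mechanism calibration c ~ sqrt(2 log(1.25/delta)).
    c = np.sqrt(2.0 * np.log(1.25 / delta))
    # Sample noise ~ N(0, K(Xstar, Xstar)) so perturbations respect the
    # kernel's smoothness rather than being i.i.d. per test point.
    Kss = rbf_kernel(Xstar, Xstar) + 1e-8 * np.eye(len(Xstar))
    z = np.linalg.cholesky(Kss) @ rng.standard_normal(len(Xstar))
    return mean + (c * sens / eps) * z
```

Drawing the noise from the kernel itself is what keeps the private predictions smooth; the cloaking refinement described in the abstract optimises that covariance further so that test points far from any individual's data receive almost no noise.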

Cite this Paper


BibTeX
@InProceedings{pmlr-v84-smith18a,
  title     = {Differentially Private Regression with {G}aussian Processes},
  author    = {Michael T. Smith and Mauricio A. Álvarez and Max Zwiessele and Neil D. Lawrence},
  booktitle = {Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics},
  pages     = {1195--1203},
  year      = {2018},
  editor    = {Amos Storkey and Fernando Perez-Cruz},
  volume    = {84},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--11 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v84/smith18a/smith18a.pdf},
  url       = {http://proceedings.mlr.press/v84/smith18a.html},
  abstract  = {A major challenge for machine learning is increasing the availability of data while respecting the privacy of individuals. Here we combine the provable privacy guarantees of the differential privacy framework with the flexibility of Gaussian processes (GPs). We propose a method using GPs to provide differentially private (DP) regression. We then improve this method by crafting the DP noise covariance structure to efficiently protect the training data, while minimising the scale of the added noise. We find that this cloaking method achieves the greatest accuracy, while still providing privacy guarantees, and offers practical DP for regression over multi-dimensional inputs. Together these methods provide a starter toolkit for combining differential privacy and GPs.}
}
Endnote
%0 Conference Paper
%T Differentially Private Regression with Gaussian Processes
%A Michael T. Smith
%A Mauricio A. Álvarez
%A Max Zwiessele
%A Neil D. Lawrence
%B Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2018
%E Amos Storkey
%E Fernando Perez-Cruz
%F pmlr-v84-smith18a
%I PMLR
%P 1195--1203
%U http://proceedings.mlr.press/v84/smith18a.html
%V 84
%X A major challenge for machine learning is increasing the availability of data while respecting the privacy of individuals. Here we combine the provable privacy guarantees of the differential privacy framework with the flexibility of Gaussian processes (GPs). We propose a method using GPs to provide differentially private (DP) regression. We then improve this method by crafting the DP noise covariance structure to efficiently protect the training data, while minimising the scale of the added noise. We find that this cloaking method achieves the greatest accuracy, while still providing privacy guarantees, and offers practical DP for regression over multi-dimensional inputs. Together these methods provide a starter toolkit for combining differential privacy and GPs.
APA
Smith, M.T., Álvarez, M.A., Zwiessele, M. & Lawrence, N.D. (2018). Differentially Private Regression with Gaussian Processes. Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 84:1195-1203. Available from http://proceedings.mlr.press/v84/smith18a.html.