A Framework for Private Matrix Analysis in Sliding Window Model

Jalaj Upadhyay, Sarvagya Upadhyay
Proceedings of the 38th International Conference on Machine Learning, PMLR 139:10465-10475, 2021.

Abstract

We perform a rigorous study of private matrix analysis when only the last $W$ updates to matrices are considered useful for analysis. We show that the existing framework in the non-private setting is not robust to the noise required for privacy. We then propose a framework robust to noise and use it to give the first efficient $o(W)$-space differentially private algorithms for spectral approximation, principal component analysis (PCA), multi-response linear regression, sparse PCA, and non-negative PCA. Prior to our work, no such result was known for sparse and non-negative differentially private PCA even in the static data setting. We also give a lower bound to demonstrate the cost of privacy in the sliding window model.
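
For a concrete picture of the task the paper addresses, the sketch below is a naive (eps, delta)-differentially private PCA over a sliding window: it stores the last $W$ rows explicitly, so it uses $O(W)$ space (exactly what the paper's framework improves to $o(W)$), and it privatizes the windowed covariance with a symmetric Gaussian perturbation in the style of the standard "Analyze Gauss" mechanism. The function name, parameters, and the unit-norm-row assumption are ours for illustration; this is not the paper's algorithm.

import numpy as np

def dp_pca_window(rows, W, k, eps, delta, rng=None):
    # Illustrative baseline only, not the paper's method. Keep the last
    # W rows explicitly -- the sliding window -- at O(W) space cost.
    rng = np.random.default_rng() if rng is None else rng
    A = np.asarray(rows, dtype=float)[-W:]
    d = A.shape[1]
    cov = A.T @ A  # d x d second-moment matrix of the window

    # Gaussian-mechanism scale for (eps, delta)-DP. If neighboring
    # streams differ in one row of L2 norm at most 1, the Frobenius
    # sensitivity of A^T A is at most 1.
    sigma = np.sqrt(2.0 * np.log(1.25 / delta)) / eps

    # Symmetric Gaussian noise: draw an i.i.d. upper triangle and
    # mirror it, as in Analyze Gauss-style covariance perturbation.
    G = np.triu(rng.normal(scale=sigma, size=(d, d)))
    noise = G + np.triu(G, 1).T

    # Top-k eigenvectors of the noisy covariance are the private
    # principal components (eigh returns eigenvalues in ascending order).
    vals, vecs = np.linalg.eigh(cov + noise)
    return vecs[:, np.argsort(vals)[::-1][:k]]

Example usage (rows are rescaled to L2 norm at most 1 so the sensitivity bound in the comment applies): given stream = rng.normal(size=(1000, 20)) with rows normalized, dp_pca_window(stream, W=256, k=5, eps=1.0, delta=1e-6) returns five private principal components of the most recent 256 rows.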

Cite this Paper


BibTeX
@InProceedings{pmlr-v139-upadhyay21a,
  title     = {A Framework for Private Matrix Analysis in Sliding Window Model},
  author    = {Upadhyay, Jalaj and Upadhyay, Sarvagya},
  booktitle = {Proceedings of the 38th International Conference on Machine Learning},
  pages     = {10465--10475},
  year      = {2021},
  editor    = {Meila, Marina and Zhang, Tong},
  volume    = {139},
  series    = {Proceedings of Machine Learning Research},
  month     = {18--24 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v139/upadhyay21a/upadhyay21a.pdf},
  url       = {https://proceedings.mlr.press/v139/upadhyay21a.html},
  abstract  = {We perform a rigorous study of private matrix analysis when only the last $W$ updates to matrices are considered useful for analysis. We show that the existing framework in the non-private setting is not robust to the noise required for privacy. We then propose a framework robust to noise and use it to give the first efficient $o(W)$-space differentially private algorithms for spectral approximation, principal component analysis (PCA), multi-response linear regression, sparse PCA, and non-negative PCA. Prior to our work, no such result was known for sparse and non-negative differentially private PCA even in the static data setting. We also give a lower bound to demonstrate the cost of privacy in the sliding window model.}
}
Endnote
%0 Conference Paper
%T A Framework for Private Matrix Analysis in Sliding Window Model
%A Jalaj Upadhyay
%A Sarvagya Upadhyay
%B Proceedings of the 38th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2021
%E Marina Meila
%E Tong Zhang
%F pmlr-v139-upadhyay21a
%I PMLR
%P 10465--10475
%U https://proceedings.mlr.press/v139/upadhyay21a.html
%V 139
%X We perform a rigorous study of private matrix analysis when only the last $W$ updates to matrices are considered useful for analysis. We show that the existing framework in the non-private setting is not robust to the noise required for privacy. We then propose a framework robust to noise and use it to give the first efficient $o(W)$-space differentially private algorithms for spectral approximation, principal component analysis (PCA), multi-response linear regression, sparse PCA, and non-negative PCA. Prior to our work, no such result was known for sparse and non-negative differentially private PCA even in the static data setting. We also give a lower bound to demonstrate the cost of privacy in the sliding window model.
APA
Upadhyay, J. & Upadhyay, S. (2021). A Framework for Private Matrix Analysis in Sliding Window Model. Proceedings of the 38th International Conference on Machine Learning, in Proceedings of Machine Learning Research 139:10465-10475. Available from https://proceedings.mlr.press/v139/upadhyay21a.html.
