Privately detecting changes in unknown distributions

Rachel Cummings, Sara Krehbiel, Yuliia Lut, Wanrong Zhang
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:2227-2237, 2020.

Abstract

The change-point detection problem seeks to identify distributional changes in streams of data. Increasingly, tools for change-point detection are applied in settings where data may be highly sensitive and formal privacy guarantees are required, such as identifying disease outbreaks based on hospital records, or IoT devices detecting activity within a home. Differential privacy has emerged as a powerful technique for enabling data analysis while preventing information leakage about individuals. Much of the prior work on change-point detection, including the only private algorithms for this problem, requires complete knowledge of the pre-change and post-change distributions, which is an unrealistic assumption for many practical applications of interest. This work develops differentially private algorithms for solving the change-point detection problem when the data distributions are unknown. Additionally, the data may be sampled from distributions that change smoothly over time, rather than fixed pre-change and post-change distributions. We apply our algorithms to detect changes in the linear trends of such data streams. Finally, we also provide experimental results to empirically validate the performance of our algorithms.

Cite this Paper


BibTeX
@InProceedings{pmlr-v119-cummings20a,
  title = {Privately detecting changes in unknown distributions},
  author = {Cummings, Rachel and Krehbiel, Sara and Lut, Yuliia and Zhang, Wanrong},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages = {2227--2237},
  year = {2020},
  editor = {III, Hal Daumé and Singh, Aarti},
  volume = {119},
  series = {Proceedings of Machine Learning Research},
  month = {13--18 Jul},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v119/cummings20a/cummings20a.pdf},
  url = {https://proceedings.mlr.press/v119/cummings20a.html},
  abstract = {The change-point detection problem seeks to identify distributional changes in streams of data. Increasingly, tools for change-point detection are applied in settings where data may be highly sensitive and formal privacy guarantees are required, such as identifying disease outbreaks based on hospital records, or IoT devices detecting activity within a home. Differential privacy has emerged as a powerful technique for enabling data analysis while preventing information leakage about individuals. Much of the prior work on change-point detection{—}including the only private algorithms for this problem{—}requires complete knowledge of the pre-change and post-change distributions, which is an unrealistic assumption for many practical applications of interest. This work develops differentially private algorithms for solving the change-point detection problem when the data distributions are unknown. Additionally, the data may be sampled from distributions that change smoothly over time, rather than fixed pre-change and post-change distributions. We apply our algorithms to detect changes in the linear trends of such data streams. Finally, we also provide experimental results to empirically validate the performance of our algorithms.}
}
Endnote
%0 Conference Paper
%T Privately detecting changes in unknown distributions
%A Rachel Cummings
%A Sara Krehbiel
%A Yuliia Lut
%A Wanrong Zhang
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-cummings20a
%I PMLR
%P 2227--2237
%U https://proceedings.mlr.press/v119/cummings20a.html
%V 119
%X The change-point detection problem seeks to identify distributional changes in streams of data. Increasingly, tools for change-point detection are applied in settings where data may be highly sensitive and formal privacy guarantees are required, such as identifying disease outbreaks based on hospital records, or IoT devices detecting activity within a home. Differential privacy has emerged as a powerful technique for enabling data analysis while preventing information leakage about individuals. Much of the prior work on change-point detection, including the only private algorithms for this problem, requires complete knowledge of the pre-change and post-change distributions, which is an unrealistic assumption for many practical applications of interest. This work develops differentially private algorithms for solving the change-point detection problem when the data distributions are unknown. Additionally, the data may be sampled from distributions that change smoothly over time, rather than fixed pre-change and post-change distributions. We apply our algorithms to detect changes in the linear trends of such data streams. Finally, we also provide experimental results to empirically validate the performance of our algorithms.
APA
Cummings, R., Krehbiel, S., Lut, Y. & Zhang, W. (2020). Privately detecting changes in unknown distributions. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:2227-2237. Available from https://proceedings.mlr.press/v119/cummings20a.html.