Differentiable Change-point Detection With Temporal Point Processes

Paramita Koley, Harshavardhan Alimi, Shrey Singla, Sourangshu Bhattacharya, Niloy Ganguly, Abir De
Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, PMLR 206:6940-6955, 2023.

Abstract

In this paper, we consider the problem of global change-point detection in event-sequence data, where both the event distributions and the change-points are assumed to be unknown. For this problem, we propose a Log-likelihood Ratio based Global Change-point Detector, which observes the entire sequence and detects a prespecified number of change-points. Building on the Transformer Hawkes Process (THP), a well-known neural TPP framework, we develop DCPD, a differentiable change-point detector that maintains distinct intensity and mark predictors for each partition. Further, we propose a sliding-window-based extension of DCPD that improves its scalability in the number of events or change-points with only minor sacrifices in performance. Experiments on synthetic datasets explore, under controlled environments, the effects of run-time, relative complexity, and other aspects of the distributions on various properties of our change-point detectors, such as robustness, detection accuracy, and scalability. Finally, we perform experiments on six real-world temporal event sequences collected from diverse domains such as health and geography, and show that our methods either outperform or perform comparably with the baselines.
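To make the core idea concrete, below is a minimal, self-contained PyTorch sketch of differentiable change-point detection for a temporal point process: K learnable change-points softly assign each event to one of K+1 segments via sigmoids, and each segment carries its own likelihood so that the change-points can be fit by gradient descent. This is an illustrative assumption-based sketch, not the authors' DCPD implementation; a homogeneous-Poisson intensity per segment stands in for the paper's THP-based intensity and mark predictors, and the function names, the sharpness parameter tau, and the toy data are all invented for illustration.

# Minimal sketch (illustrative, not the paper's DCPD/THP code): events are softly
# assigned to K+1 segments around K learnable change-points, and a per-segment
# constant-intensity Poisson likelihood is maximized by gradient descent.
import torch

def soft_partition_weights(times, changepoints, tau=10.0):
    """Soft membership of each event in each of K+1 segments.

    times:        (N,) event timestamps
    changepoints: (K,) candidate change-point locations (learnable)
    returns:      (N, K+1) weights that sum to 1 over segments
    """
    # sigmoid(tau * (t - c)) is close to 1 if event t lies after change-point c
    after = torch.sigmoid(tau * (times[:, None] - changepoints[None, :]))  # (N, K)
    ones = torch.ones(times.shape[0], 1)
    # weight for segment k = P(after c_k) - P(after c_{k+1}); sums telescope to 1
    left = torch.cat([ones, after], dim=1)                    # (N, K+1)
    right = torch.cat([after, torch.zeros_like(ones)], dim=1)  # (N, K+1)
    return left - right

def segment_log_likelihood(times, changepoints, log_rates, T, tau=10.0):
    """Poisson log-likelihood with soft segment assignments.

    log_rates: (K+1,) learnable log-intensity per segment (a stand-in for
               per-partition intensity and mark predictors).
    """
    w = soft_partition_weights(times, changepoints, tau)   # (N, K+1)
    # event term: sum_i sum_k w_ik * log lambda_k
    event_term = (w * log_rates[None, :]).sum()
    # compensator: integral of each segment's intensity over its length
    bounds = torch.cat([torch.zeros(1), changepoints, torch.tensor([T])])
    seg_len = bounds[1:] - bounds[:-1]
    compensator = (seg_len * log_rates.exp()).sum()
    return event_term - compensator

if __name__ == "__main__":
    torch.manual_seed(0)
    # toy sequence on [0, 10]: the event rate jumps from low to high at t = 6
    t1 = torch.rand(6) * 6.0
    t2 = 6.0 + torch.rand(20) * 4.0
    times, _ = torch.sort(torch.cat([t1, t2]))

    cp = torch.tensor([3.0], requires_grad=True)      # one change-point, poor init
    log_rates = torch.zeros(2, requires_grad=True)
    opt = torch.optim.Adam([cp, log_rates], lr=0.05)
    for _ in range(500):
        opt.zero_grad()
        loss = -segment_log_likelihood(times, cp, log_rates, T=10.0)
        loss.backward()
        opt.step()
    print("estimated change-point:", cp.item())        # should drift toward ~6

In this sketch the sigmoid sharpness tau plays the role of the relaxation that makes the partition boundaries differentiable; the paper's method additionally handles marks, neural (THP) intensities, a prespecified number of change-points over the whole sequence, and a sliding-window variant for scalability.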

Cite this Paper


BibTeX
@InProceedings{pmlr-v206-koley23a,
  title     = {Differentiable Change-point Detection With Temporal Point Processes},
  author    = {Koley, Paramita and Alimi, Harshavardhan and Singla, Shrey and Bhattacharya, Sourangshu and Ganguly, Niloy and De, Abir},
  booktitle = {Proceedings of The 26th International Conference on Artificial Intelligence and Statistics},
  pages     = {6940--6955},
  year      = {2023},
  editor    = {Ruiz, Francisco and Dy, Jennifer and van de Meent, Jan-Willem},
  volume    = {206},
  series    = {Proceedings of Machine Learning Research},
  month     = {25--27 Apr},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v206/koley23a/koley23a.pdf},
  url       = {https://proceedings.mlr.press/v206/koley23a.html},
  abstract  = {In this paper, we consider the problem of global change-point detection in event sequence data, where both the event distributions and change-points are assumed to be unknown. For this problem, we propose a Log-likelihood Ratio based Global Change-point Detector, which observes the entire sequence and detects a prespecified number of change-points. Based on the Transformer Hawkes Process (THP), a well-known neural TPP framework, we develop DCPD, a differentiable change-point detector, along with maintaining distinct intensity and mark predictor for each partition. Further, we propose a sliding-window-based extension of DCPD to improve its scalability in terms of the number of events or change-points with minor sacrifices in performance. Experiments on synthetic datasets explore the effects of run-time, relative complexity, and other aspects of distributions on various properties of our changepoint detectors, namely robustness, detection accuracy, scalability, etc. under controlled environments. Finally, we perform experiments on six real-world temporal event sequences collected from diverse domains like health, geographical regions, etc., and show that our methods either outperform or perform comparably with the baselines.}
}
Endnote
%0 Conference Paper
%T Differentiable Change-point Detection With Temporal Point Processes
%A Paramita Koley
%A Harshavardhan Alimi
%A Shrey Singla
%A Sourangshu Bhattacharya
%A Niloy Ganguly
%A Abir De
%B Proceedings of The 26th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2023
%E Francisco Ruiz
%E Jennifer Dy
%E Jan-Willem van de Meent
%F pmlr-v206-koley23a
%I PMLR
%P 6940--6955
%U https://proceedings.mlr.press/v206/koley23a.html
%V 206
%X In this paper, we consider the problem of global change-point detection in event sequence data, where both the event distributions and change-points are assumed to be unknown. For this problem, we propose a Log-likelihood Ratio based Global Change-point Detector, which observes the entire sequence and detects a prespecified number of change-points. Based on the Transformer Hawkes Process (THP), a well-known neural TPP framework, we develop DCPD, a differentiable change-point detector, along with maintaining distinct intensity and mark predictor for each partition. Further, we propose a sliding-window-based extension of DCPD to improve its scalability in terms of the number of events or change-points with minor sacrifices in performance. Experiments on synthetic datasets explore the effects of run-time, relative complexity, and other aspects of distributions on various properties of our changepoint detectors, namely robustness, detection accuracy, scalability, etc. under controlled environments. Finally, we perform experiments on six real-world temporal event sequences collected from diverse domains like health, geographical regions, etc., and show that our methods either outperform or perform comparably with the baselines.
APA
Koley, P., Alimi, H., Singla, S., Bhattacharya, S., Ganguly, N. & De, A. (2023). Differentiable Change-point Detection With Temporal Point Processes. Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 206:6940-6955. Available from https://proceedings.mlr.press/v206/koley23a.html.
