Sequential Gradient Descent and Quasi-Newton’s Method for Change-Point Analysis

Xianyang Zhang, Trisha Dawn
Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, PMLR 206:1129-1143, 2023.

Abstract

One common approach to detecting change-points is minimizing a cost function over possible numbers and locations of change-points. The framework includes several well-established procedures, such as the penalized likelihood and minimum description length. Such an approach requires finding the cost value repeatedly over different segments of the data set, which can be time-consuming when (i) the data sequence is long and (ii) obtaining the cost value involves solving a non-trivial optimization problem. This paper introduces a new sequential updating method (SE) to find the cost value effectively. The core idea is to update the cost value using the information from previous steps without re-optimizing the objective function. The new method is applied to change-point detection in generalized linear models and penalized regression. Numerical studies show that the new approach can be orders of magnitude faster than the Pruned Exact Linear Time (PELT) method without sacrificing estimation accuracy.
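To make the core idea concrete, below is a minimal sketch, not the authors' SE implementation, of how a segment cost can be updated sequentially instead of re-optimized from scratch. It assumes a one-parameter logistic (Bernoulli GLM) segment model, a single gradient step per newly added observation, and a running-sum approximation of the minimized cost; the learning rate lr and the helper name sequential_costs are illustrative choices, not taken from the paper.

    # Illustrative sketch only; not the paper's SE method.
    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def sequential_costs(x, y, s, lr=0.5):
        # Approximate segment costs C(s, t) for all t > s under a
        # one-parameter logistic model P(y = 1 | x) = sigmoid(theta * x).
        # Rather than re-minimizing the negative log-likelihood of
        # y[s:t] for every segment end t, the estimate theta is
        # warm-started from the previous step and refined with one
        # gradient step on the new observation, so extending the
        # segment by one point costs O(1).
        n = len(y)
        theta = 0.0
        running_nll = 0.0
        costs = {}
        for t in range(s, n):
            p = sigmoid(theta * x[t])
            theta -= lr * (p - y[t]) * x[t]   # one online gradient step
            p = sigmoid(theta * x[t])
            # Accumulate the new point's loss at the refined estimate;
            # the running sum approximates the minimized segment cost.
            running_nll -= y[t] * np.log(p) + (1 - y[t]) * np.log(1 - p)
            costs[t + 1] = running_nll        # approximate C(s, t + 1)
        return costs

These approximate segment costs then plug into the standard penalized dynamic program F(t) = min_{s < t} { F(s) + C(s, t) + beta }, the same recursion that PELT prunes; the speed-up in this framework comes from each cost update being a cheap warm-started step rather than a full refit of the segment model.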

Cite this Paper

BibTeX
@InProceedings{pmlr-v206-zhang23b,
  title     = {Sequential Gradient Descent and Quasi-Newton’s Method for Change-Point Analysis},
  author    = {Zhang, Xianyang and Dawn, Trisha},
  booktitle = {Proceedings of The 26th International Conference on Artificial Intelligence and Statistics},
  pages     = {1129--1143},
  year      = {2023},
  editor    = {Ruiz, Francisco and Dy, Jennifer and van de Meent, Jan-Willem},
  volume    = {206},
  series    = {Proceedings of Machine Learning Research},
  month     = {25--27 Apr},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v206/zhang23b/zhang23b.pdf},
  url       = {https://proceedings.mlr.press/v206/zhang23b.html},
  abstract  = {One common approach to detecting change-points is minimizing a cost function over possible numbers and locations of change-points. The framework includes several well-established procedures, such as the penalized likelihood and minimum description length. Such an approach requires finding the cost value repeatedly over different segments of the data set, which can be time-consuming when (i) the data sequence is long and (ii) obtaining the cost value involves solving a non-trivial optimization problem. This paper introduces a new sequential updating method (SE) to find the cost value effectively. The core idea is to update the cost value using the information from previous steps without re-optimizing the objective function. The new method is applied to change-point detection in generalized linear models and penalized regression. Numerical studies show that the new approach can be orders of magnitude faster than the Pruned Exact Linear Time (PELT) method without sacrificing estimation accuracy.}
}
Endnote
%0 Conference Paper
%T Sequential Gradient Descent and Quasi-Newton’s Method for Change-Point Analysis
%A Xianyang Zhang
%A Trisha Dawn
%B Proceedings of The 26th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2023
%E Francisco Ruiz
%E Jennifer Dy
%E Jan-Willem van de Meent
%F pmlr-v206-zhang23b
%I PMLR
%P 1129--1143
%U https://proceedings.mlr.press/v206/zhang23b.html
%V 206
%X One common approach to detecting change-points is minimizing a cost function over possible numbers and locations of change-points. The framework includes several well-established procedures, such as the penalized likelihood and minimum description length. Such an approach requires finding the cost value repeatedly over different segments of the data set, which can be time-consuming when (i) the data sequence is long and (ii) obtaining the cost value involves solving a non-trivial optimization problem. This paper introduces a new sequential updating method (SE) to find the cost value effectively. The core idea is to update the cost value using the information from previous steps without re-optimizing the objective function. The new method is applied to change-point detection in generalized linear models and penalized regression. Numerical studies show that the new approach can be orders of magnitude faster than the Pruned Exact Linear Time (PELT) method without sacrificing estimation accuracy.
APA
Zhang, X. & Dawn, T. (2023). Sequential Gradient Descent and Quasi-Newton’s Method for Change-Point Analysis. Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 206:1129-1143. Available from https://proceedings.mlr.press/v206/zhang23b.html.