FaDIn: Fast Discretized Inference for Hawkes Processes with General Parametric Kernels

Guillaume Staerman, Cédric Allain, Alexandre Gramfort, Thomas Moreau
Proceedings of the 40th International Conference on Machine Learning, PMLR 202:32575-32597, 2023.

Abstract

Temporal point processes (TPP) are a natural tool for modeling event-based data. Among all TPP models, Hawkes processes have proven to be the most widely used, mainly due to their adequate modeling for various applications, particularly when considering exponential or non-parametric kernels. Although non-parametric kernels are an option, such models require large datasets. While exponential kernels are more data efficient and relevant for specific applications where events immediately trigger more events, they are ill-suited for applications where latencies need to be estimated, such as in neuroscience. This work aims to offer an efficient solution to TPP inference using general parametric kernels with finite support. The developed solution consists of a fast $\ell_2$ gradient-based solver leveraging a discretized version of the events. After theoretically supporting the use of discretization, the statistical and computational efficiency of the novel approach is demonstrated through various numerical experiments. Finally, the method's effectiveness is evaluated by modeling the occurrence of stimuli-induced patterns from brain signals recorded with magnetoencephalography (MEG). Given the use of general parametric kernels, results show that the proposed approach leads to improved estimation of pattern latency compared to the state-of-the-art.
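The core idea sketched in the abstract, namely binning events on a regular grid and minimizing a discretized $\ell_2$ loss over the resulting intensity, can be illustrated with a minimal sketch. This is not the paper's FaDIn implementation; the function names (`discretized_intensity`, `l2_discrete_loss`) are illustrative, and it assumes a univariate Hawkes process with the finite-support kernel already sampled on the grid.

```python
import numpy as np

def discretized_intensity(event_counts, baseline, kernel, delta):
    """Intensity on a regular grid of step `delta`:
    lambda[n] = baseline + sum_tau kernel[tau] * event_counts[n - tau].

    event_counts : events binned per grid cell (z in discretized form)
    kernel       : finite-support kernel phi sampled on the grid
    """
    # Discrete convolution of binned events with the sampled kernel;
    # truncate to the observation window.
    conv = np.convolve(event_counts, kernel)[: len(event_counts)]
    return baseline + conv

def l2_discrete_loss(event_counts, baseline, kernel, delta):
    """Discretized l2 loss for point processes:
    delta * sum_n lambda[n]^2  -  2 * sum_n z[n] * lambda[n],
    i.e. a Riemann-sum approximation of
    integral lambda(t)^2 dt - 2 * sum_{events t_i} lambda(t_i)."""
    lam = discretized_intensity(event_counts, baseline, kernel, delta)
    return delta * np.sum(lam ** 2) - 2.0 * np.sum(lam * event_counts)

# Toy usage: 4 grid bins, events in bins 0 and 2, step 0.1.
z = np.array([1, 0, 1, 0])
lam = discretized_intensity(z, baseline=0.5, kernel=np.array([0.0, 0.8]), delta=0.1)
loss = l2_discrete_loss(z, baseline=0.5, kernel=np.array([0.0, 0.8]), delta=0.1)
```

Because the loss is a smooth function of the baseline and of the sampled kernel values, it can be minimized by any gradient-based solver (e.g. autodiff over a parametric kernel family), which is what makes this formulation fast compared with log-likelihood-based inference.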

Cite this Paper

BibTeX
@InProceedings{pmlr-v202-staerman23a,
  title     = {{F}a{DI}n: Fast Discretized Inference for {H}awkes Processes with General Parametric Kernels},
  author    = {Staerman, Guillaume and Allain, C\'{e}dric and Gramfort, Alexandre and Moreau, Thomas},
  booktitle = {Proceedings of the 40th International Conference on Machine Learning},
  pages     = {32575--32597},
  year      = {2023},
  editor    = {Krause, Andreas and Brunskill, Emma and Cho, Kyunghyun and Engelhardt, Barbara and Sabato, Sivan and Scarlett, Jonathan},
  volume    = {202},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--29 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v202/staerman23a/staerman23a.pdf},
  url       = {https://proceedings.mlr.press/v202/staerman23a.html},
  abstract  = {Temporal point processes (TPP) are a natural tool for modeling event-based data. Among all TPP models, Hawkes processes have proven to be the most widely used, mainly due to their adequate modeling for various applications, particularly when considering exponential or non-parametric kernels. Although non-parametric kernels are an option, such models require large datasets. While exponential kernels are more data efficient and relevant for specific applications where events immediately trigger more events, they are ill-suited for applications where latencies need to be estimated, such as in neuroscience. This work aims to offer an efficient solution to TPP inference using general parametric kernels with finite support. The developed solution consists of a fast $\ell_2$ gradient-based solver leveraging a discretized version of the events. After theoretically supporting the use of discretization, the statistical and computational efficiency of the novel approach is demonstrated through various numerical experiments. Finally, the method's effectiveness is evaluated by modeling the occurrence of stimuli-induced patterns from brain signals recorded with magnetoencephalography (MEG). Given the use of general parametric kernels, results show that the proposed approach leads to improved estimation of pattern latency compared to the state-of-the-art.}
}
Endnote
%0 Conference Paper %T FaDIn: Fast Discretized Inference for Hawkes Processes with General Parametric Kernels %A Guillaume Staerman %A Cédric Allain %A Alexandre Gramfort %A Thomas Moreau %B Proceedings of the 40th International Conference on Machine Learning %C Proceedings of Machine Learning Research %D 2023 %E Andreas Krause %E Emma Brunskill %E Kyunghyun Cho %E Barbara Engelhardt %E Sivan Sabato %E Jonathan Scarlett %F pmlr-v202-staerman23a %I PMLR %P 32575--32597 %U https://proceedings.mlr.press/v202/staerman23a.html %V 202 %X Temporal point processes (TPP) are a natural tool for modeling event-based data. Among all TPP models, Hawkes processes have proven to be the most widely used, mainly due to their adequate modeling for various applications, particularly when considering exponential or non-parametric kernels. Although non-parametric kernels are an option, such models require large datasets. While exponential kernels are more data efficient and relevant for specific applications where events immediately trigger more events, they are ill-suited for applications where latencies need to be estimated, such as in neuroscience. This work aims to offer an efficient solution to TPP inference using general parametric kernels with finite support. The developed solution consists of a fast $\ell_2$ gradient-based solver leveraging a discretized version of the events. After theoretically supporting the use of discretization, the statistical and computational efficiency of the novel approach is demonstrated through various numerical experiments. Finally, the method's effectiveness is evaluated by modeling the occurrence of stimuli-induced patterns from brain signals recorded with magnetoencephalography (MEG). Given the use of general parametric kernels, results show that the proposed approach leads to improved estimation of pattern latency compared to the state-of-the-art.
APA
Staerman, G., Allain, C., Gramfort, A. & Moreau, T. (2023). FaDIn: Fast Discretized Inference for Hawkes Processes with General Parametric Kernels. Proceedings of the 40th International Conference on Machine Learning, in Proceedings of Machine Learning Research 202:32575-32597. Available from https://proceedings.mlr.press/v202/staerman23a.html.
