Laplace Transform Based Low-Complexity Learning of Continuous Markov Semigroups

Vladimir R Kostic, Karim Lounici, Hélène Halconruy, Timothée Devergne, Pietro Novelli, Massimiliano Pontil
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:31560-31589, 2025.

Abstract

Markov processes serve as universal models for many real-world random processes. This paper presents a data-driven approach to learning these models through the spectral decomposition of the infinitesimal generator (IG) of the Markov semigroup. Its unbounded nature complicates traditional methods such as vector-valued regression and Hilbert-Schmidt operator analysis. Existing techniques, including physics-informed kernel regression, are computationally expensive and limited in scope, with no recovery guarantees for transfer operator methods when the time-lag is small. We propose a novel method leveraging the IG’s resolvent, characterized by the Laplace transform of transfer operators. This approach is robust to time-lag variations, ensuring accurate eigenvalue learning even for small time-lags. Our statistical analysis applies to a broader class of Markov processes than current methods while reducing computational complexity from quadratic to linear in the state dimension. Finally, we demonstrate our theoretical findings in several experiments.
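The abstract's key identity — the IG's resolvent as the Laplace transform of the transfer operators — can be illustrated on a toy finite-state Markov jump process. The sketch below is a hypothetical illustration under that simplification (a 2-state generator matrix `L`, not the paper's estimator): it checks numerically that \((\lambda I - L)^{-1} = \int_0^\infty e^{-\lambda t}\, e^{tL}\, dt\) for \(\lambda > 0\), and that the generator's eigenvalues \(\theta\) are recovered from resolvent eigenvalues \(\mu\) via \(\theta = \lambda - 1/\mu\).

```python
import numpy as np

# Toy generator of a 2-state Markov jump process:
# off-diagonal rates >= 0, rows sum to zero.
L = np.array([[-1.0, 1.0],
              [0.5, -0.5]])
lam = 2.0                      # Laplace/resolvent parameter, lam > 0
n = L.shape[0]

# Resolvent computed directly.
R = np.linalg.inv(lam * np.eye(n) - L)

# Laplace transform of the semigroup e^{tL}, via eigendecomposition of L
# and midpoint quadrature on [0, T].
w, V = np.linalg.eig(L)
T, N = 40.0, 80_000
dt = T / N
ts = (np.arange(N) + 0.5) * dt
# Scalar Laplace integrals of each semigroup mode e^{w_i t}.
s = (np.exp(np.outer(ts, w - lam)) * dt).sum(axis=0)
R_laplace = (V * s) @ np.linalg.inv(V)

assert np.allclose(R, R_laplace.real, atol=1e-3)

# Recover the generator's eigenvalues from the resolvent spectrum.
mu = np.linalg.eigvals(R)
theta = np.sort((lam - 1.0 / mu).real)
print(theta)   # close to [-1.5, 0.0], the true eigenvalues of L
```

Note how the recovery map \(\theta = \lambda - 1/\mu\) is well-conditioned for any fixed \(\lambda > 0\), which mirrors the abstract's point that the resolvent-based approach avoids the small-time-lag degradation of transfer-operator methods.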

Cite this Paper

BibTeX
@InProceedings{pmlr-v267-kostic25a,
  title     = {{L}aplace Transform Based Low-Complexity Learning of Continuous {M}arkov Semigroups},
  author    = {Kostic, Vladimir R and Lounici, Karim and Halconruy, H\'{e}l\`{e}ne and Devergne, Timoth\'{e}e and Novelli, Pietro and Pontil, Massimiliano},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages     = {31560--31589},
  year      = {2025},
  editor    = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume    = {267},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/kostic25a/kostic25a.pdf},
  url       = {https://proceedings.mlr.press/v267/kostic25a.html},
  abstract  = {Markov processes serve as universal models for many real-world random processes. This paper presents a data-driven approach to learning these models through the spectral decomposition of the infinitesimal generator (IG) of the Markov semigroup. Its unbounded nature complicates traditional methods such as vector-valued regression and Hilbert-Schmidt operator analysis. Existing techniques, including physics-informed kernel regression, are computationally expensive and limited in scope, with no recovery guarantees for transfer operator methods when the time-lag is small. We propose a novel method leveraging the IG's resolvent, characterized by the Laplace transform of transfer operators. This approach is robust to time-lag variations, ensuring accurate eigenvalue learning even for small time-lags. Our statistical analysis applies to a broader class of Markov processes than current methods while reducing computational complexity from quadratic to linear in the state dimension. Finally, we demonstrate our theoretical findings in several experiments.}
}
Endnote
%0 Conference Paper
%T Laplace Transform Based Low-Complexity Learning of Continuous Markov Semigroups
%A Vladimir R Kostic
%A Karim Lounici
%A Hélène Halconruy
%A Timothée Devergne
%A Pietro Novelli
%A Massimiliano Pontil
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-kostic25a
%I PMLR
%P 31560--31589
%U https://proceedings.mlr.press/v267/kostic25a.html
%V 267
%X Markov processes serve as universal models for many real-world random processes. This paper presents a data-driven approach to learning these models through the spectral decomposition of the infinitesimal generator (IG) of the Markov semigroup. Its unbounded nature complicates traditional methods such as vector-valued regression and Hilbert-Schmidt operator analysis. Existing techniques, including physics-informed kernel regression, are computationally expensive and limited in scope, with no recovery guarantees for transfer operator methods when the time-lag is small. We propose a novel method leveraging the IG’s resolvent, characterized by the Laplace transform of transfer operators. This approach is robust to time-lag variations, ensuring accurate eigenvalue learning even for small time-lags. Our statistical analysis applies to a broader class of Markov processes than current methods while reducing computational complexity from quadratic to linear in the state dimension. Finally, we demonstrate our theoretical findings in several experiments.
APA
Kostic, V.R., Lounici, K., Halconruy, H., Devergne, T., Novelli, P. & Pontil, M. (2025). Laplace Transform Based Low-Complexity Learning of Continuous Markov Semigroups. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:31560-31589. Available from https://proceedings.mlr.press/v267/kostic25a.html.
