Quasi Zigzag Persistence: A Topological Framework for Analyzing Time-Varying Data

Tamal K. Dey, Shreyas N. Samaga
Proceedings of the 1st Conference on Topology, Algebra, and Geometry in Data Science (TAG-DS 2025), PMLR 321:147-165, 2026.

Abstract

In this paper, we propose Quasi Zigzag Persistent Homology (QZPH) as a framework for analyzing time-varying data by integrating multiparameter persistence and zigzag persistence. To this end, we introduce a stable topological invariant that captures both static and dynamic features at different scales. We present an algorithm to compute this invariant efficiently. We show that it enhances machine learning models when applied to tasks such as sleep-stage detection, demonstrating its effectiveness in capturing the evolving patterns in time-varying datasets.

Cite this Paper


BibTeX
@InProceedings{pmlr-v321-dey26a,
  title = {Quasi Zigzag Persistence: A Topological Framework for Analyzing Time-Varying Data},
  author = {Dey, Tamal K. and Samaga, Shreyas N.},
  booktitle = {Proceedings of the 1st Conference on Topology, Algebra, and Geometry in Data Science (TAG-DS 2025)},
  pages = {147--165},
  year = {2026},
  editor = {Bernardez Gil, Guillermo and Black, Mitchell and Cloninger, Alexander and Doster, Timothy and Emerson, Tegan and García-Rodondo, Inés and Holtz, Chester and Kotak, Mit and Kvinge, Henry and Mishne, Gal and Papillon, Mathilde and Pouplin, Alison and Rainey, Katie and Rieck, Bastian and Telyatnikov, Lev and Yeats, Eric and Wang, Qingsong and Wang, Yusu and Wayland, Jeremy},
  volume = {321},
  series = {Proceedings of Machine Learning Research},
  month = {01--02 Dec},
  publisher = {PMLR},
  pdf = {https://raw.githubusercontent.com/mlresearch/v321/main/assets/dey26a/dey26a.pdf},
  url = {https://proceedings.mlr.press/v321/dey26a.html},
  abstract = {In this paper, we propose Quasi Zigzag Persistent Homology (QZPH) as a framework for analyzing time-varying data by integrating multiparameter persistence and zigzag persistence. To this end, we introduce a stable topological invariant that captures both static and dynamic features at different scales. We present an algorithm to compute this invariant efficiently. We show that it enhances machine learning models when applied to tasks such as sleep-stage detection, demonstrating its effectiveness in capturing the evolving patterns in time-varying datasets.}
}
Endnote
%0 Conference Paper
%T Quasi Zigzag Persistence: A Topological Framework for Analyzing Time-Varying Data
%A Tamal K. Dey
%A Shreyas N. Samaga
%B Proceedings of the 1st Conference on Topology, Algebra, and Geometry in Data Science (TAG-DS 2025)
%C Proceedings of Machine Learning Research
%D 2026
%E Guillermo Bernardez Gil
%E Mitchell Black
%E Alexander Cloninger
%E Timothy Doster
%E Tegan Emerson
%E Inés García-Rodondo
%E Chester Holtz
%E Mit Kotak
%E Henry Kvinge
%E Gal Mishne
%E Mathilde Papillon
%E Alison Pouplin
%E Katie Rainey
%E Bastian Rieck
%E Lev Telyatnikov
%E Eric Yeats
%E Qingsong Wang
%E Yusu Wang
%E Jeremy Wayland
%F pmlr-v321-dey26a
%I PMLR
%P 147--165
%U https://proceedings.mlr.press/v321/dey26a.html
%V 321
%X In this paper, we propose Quasi Zigzag Persistent Homology (QZPH) as a framework for analyzing time-varying data by integrating multiparameter persistence and zigzag persistence. To this end, we introduce a stable topological invariant that captures both static and dynamic features at different scales. We present an algorithm to compute this invariant efficiently. We show that it enhances machine learning models when applied to tasks such as sleep-stage detection, demonstrating its effectiveness in capturing the evolving patterns in time-varying datasets.
APA
Dey, T.K. & Samaga, S.N. (2026). Quasi Zigzag Persistence: A Topological Framework for Analyzing Time-Varying Data. Proceedings of the 1st Conference on Topology, Algebra, and Geometry in Data Science (TAG-DS 2025), in Proceedings of Machine Learning Research 321:147-165. Available from https://proceedings.mlr.press/v321/dey26a.html.