Optimizing persistent homology based functions

Mathieu Carriere, Frederic Chazal, Marc Glisse, Yuichi Ike, Hariprasad Kannan, Yuhei Umeda
Proceedings of the 38th International Conference on Machine Learning, PMLR 139:1294-1303, 2021.

Abstract

Solving optimization tasks based on functions and losses with a topological flavor is a very active and growing field of research in data science and Topological Data Analysis, with applications in non-convex optimization, statistics and machine learning. However, the approaches proposed in the literature are usually anchored to a specific application and/or topological construction, and do not come with theoretical guarantees. To address this issue, we study the differentiability of a general map associated with the most common topological construction, that is, the persistence map. Building on real analytic geometry arguments, we propose a general framework that allows us to define and compute gradients for persistence-based functions in a very simple way. We also provide a simple, explicit and sufficient condition for convergence of stochastic subgradient methods for such functions. This result encompasses all the constructions and applications of topological optimization in the literature. Finally, we provide associated code, that is easy to handle and to mix with other non-topological methods and constraints, as well as some experiments showcasing the versatility of our approach.
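The framework sketched in the abstract (gradients of persistence-based losses supported on the persistence pairs) can be illustrated with a minimal, self-contained example. This is not the authors' released implementation; it is a from-scratch sketch of the simplest case: 0-dimensional sublevel-set persistence of a function sampled on a line graph, with total persistence as the loss. The subgradient is +1 at each death index and -1 at each birth index; all function names below are illustrative.

```python
import numpy as np

def persistence_pairs_0d(f):
    """0-dimensional persistence pairs of the sublevel-set filtration
    of a function sampled on a line graph (values f[0], ..., f[n-1]).
    Computed with a union-find sweep over values in increasing order."""
    n = len(f)
    order = sorted(range(n), key=lambda i: f[i])
    parent = list(range(n))
    birth = {}              # root -> index of the component's minimum
    active = [False] * n
    pairs = []              # (birth_index, death_index)

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]   # path halving
            i = parent[i]
        return i

    for i in order:
        active[i] = True
        birth[i] = i
        for j in (i - 1, i + 1):
            if 0 <= j < n and active[j]:
                ri, rj = find(i), find(j)
                if ri == rj:
                    continue
                # the component with the higher (younger) birth value dies
                if f[birth[ri]] < f[birth[rj]]:
                    ri, rj = rj, ri
                if birth[ri] != i:          # skip trivial self-pairs
                    pairs.append((birth[ri], i))
                parent[ri] = rj
    return pairs  # the global minimum's class is essential and omitted

def total_persistence_and_grad(f):
    """Loss = sum of (death value - birth value) over finite pairs.
    Its (sub)gradient w.r.t. the sampled values is +1 at every death
    index and -1 at every birth index, as the pairing is locally fixed."""
    pairs = persistence_pairs_0d(f)
    grad = np.zeros(len(f))
    loss = 0.0
    for b, d in pairs:
        loss += f[d] - f[b]
        grad[d] += 1.0
        grad[b] -= 1.0
    return loss, grad

f = np.array([0.0, 2.0, 1.0, 3.0, 0.5])
loss, grad = total_persistence_and_grad(f)
f_new = f - 0.1 * grad                     # one subgradient step
loss_new, _ = total_persistence_and_grad(f_new)
```

One descent step lowers the total persistence (the bumps of `f` are flattened), which is the behavior the paper's convergence result guarantees for stochastic subgradient methods under its stated conditions.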

Cite this Paper


BibTeX
@InProceedings{pmlr-v139-carriere21a,
  title     = {Optimizing persistent homology based functions},
  author    = {Carriere, Mathieu and Chazal, Frederic and Glisse, Marc and Ike, Yuichi and Kannan, Hariprasad and Umeda, Yuhei},
  booktitle = {Proceedings of the 38th International Conference on Machine Learning},
  pages     = {1294--1303},
  year      = {2021},
  editor    = {Meila, Marina and Zhang, Tong},
  volume    = {139},
  series    = {Proceedings of Machine Learning Research},
  month     = {18--24 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v139/carriere21a/carriere21a.pdf},
  url       = {https://proceedings.mlr.press/v139/carriere21a.html}
}
Endnote
%0 Conference Paper
%T Optimizing persistent homology based functions
%A Mathieu Carriere
%A Frederic Chazal
%A Marc Glisse
%A Yuichi Ike
%A Hariprasad Kannan
%A Yuhei Umeda
%B Proceedings of the 38th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2021
%E Marina Meila
%E Tong Zhang
%F pmlr-v139-carriere21a
%I PMLR
%P 1294--1303
%U https://proceedings.mlr.press/v139/carriere21a.html
%V 139
APA
Carriere, M., Chazal, F., Glisse, M., Ike, Y., Kannan, H. & Umeda, Y. (2021). Optimizing persistent homology based functions. Proceedings of the 38th International Conference on Machine Learning, in Proceedings of Machine Learning Research 139:1294-1303. Available from https://proceedings.mlr.press/v139/carriere21a.html.