Sliced Wasserstein Kernel for Persistence Diagrams

Mathieu Carrière, Marco Cuturi, Steve Oudot
Proceedings of the 34th International Conference on Machine Learning, PMLR 70:664-673, 2017.

Abstract

Persistence diagrams (PDs) play a key role in topological data analysis (TDA), in which they are routinely used to succinctly describe the complex topological properties of complicated shapes. PDs enjoy strong stability properties and have proven their utility in various learning contexts. They do not, however, live in a space naturally endowed with a Hilbert structure and are usually compared with specific distances, such as the bottleneck distance. To incorporate PDs in a learning pipeline, several kernels have been proposed for PDs with a strong emphasis on the stability of the RKHS distance w.r.t. perturbations of the PDs. In this article, we use the Sliced Wasserstein approximation of the Wasserstein distance to define a new kernel for PDs, which is not only provably stable but also provably discriminative w.r.t. the Wasserstein distance $W^1_\infty$ between PDs. We also demonstrate its practicality by developing an approximation technique to reduce kernel computation time, and show that our proposal compares favorably to existing kernels for PDs on several benchmarks.
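The construction summarized in the abstract can be sketched in a few lines of Python. This is an illustrative sketch, not the authors' reference implementation: the function names, the finite number of projection directions (the paper defines the distance as an integral over directions), and the Gaussian-style bandwidth `sigma` are assumptions made for the example. Each diagram is augmented with the orthogonal projection of the other diagram onto the diagonal so that both point sets have equal size, after which the 1D Wasserstein distance along each direction reduces to an L1 distance between sorted projections.

```python
import numpy as np

def sliced_wasserstein(pd1, pd2, n_directions=50):
    """Approximate Sliced Wasserstein distance between two persistence
    diagrams, each given as an (n, 2) array of (birth, death) points.
    """
    # Orthogonal projection of a diagram's points onto the diagonal {x = y}.
    def diag_proj(pts):
        m = (pts[:, 0] + pts[:, 1]) / 2.0
        return np.column_stack([m, m])

    # Augment each diagram with the projection of the other, so both
    # point sets have the same cardinality.
    u = np.vstack([pd1, diag_proj(pd2)])
    v = np.vstack([pd2, diag_proj(pd1)])

    # Average the 1D Wasserstein distances over sampled directions;
    # in 1D the optimal matching is between sorted projections.
    thetas = np.linspace(-np.pi / 2, np.pi / 2, n_directions, endpoint=False)
    total = 0.0
    for t in thetas:
        d = np.array([np.cos(t), np.sin(t)])
        total += np.sum(np.abs(np.sort(u @ d) - np.sort(v @ d)))
    return total / n_directions

def sw_kernel(pd1, pd2, sigma=1.0, n_directions=50):
    """Gaussian-style kernel built on the Sliced Wasserstein distance."""
    return np.exp(-sliced_wasserstein(pd1, pd2, n_directions) / (2 * sigma ** 2))
```

Because the Sliced Wasserstein distance is a (negative-definite) metric, exponentiating its negative yields a positive-definite kernel usable in any kernel method (SVMs, kernel PCA, etc.), which is the point of the paper's construction.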

Cite this Paper


BibTeX
@InProceedings{pmlr-v70-carriere17a,
  title     = {Sliced {W}asserstein Kernel for Persistence Diagrams},
  author    = {Mathieu Carri{\`e}re and Marco Cuturi and Steve Oudot},
  booktitle = {Proceedings of the 34th International Conference on Machine Learning},
  pages     = {664--673},
  year      = {2017},
  editor    = {Precup, Doina and Teh, Yee Whye},
  volume    = {70},
  series    = {Proceedings of Machine Learning Research},
  month     = {06--11 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v70/carriere17a/carriere17a.pdf},
  url       = {https://proceedings.mlr.press/v70/carriere17a.html},
  abstract  = {Persistence diagrams (PDs) play a key role in topological data analysis (TDA), in which they are routinely used to describe succinctly complex topological properties of complicated shapes. PDs enjoy strong stability properties and have proven their utility in various learning contexts. They do not, however, live in a space naturally endowed with a Hilbert structure and are usually compared with specific distances, such as the bottleneck distance. To incorporate PDs in a learning pipeline, several kernels have been proposed for PDs with a strong emphasis on the stability of the RKHS distance w.r.t. perturbations of the PDs. In this article, we use the Sliced Wasserstein approximation of the Wasserstein distance to define a new kernel for PDs, which is not only provably stable but also provably discriminative w.r.t. the Wasserstein distance $W^1_\infty$ between PDs. We also demonstrate its practicality, by developing an approximation technique to reduce kernel computation time, and show that our proposal compares favorably to existing kernels for PDs on several benchmarks.}
}
Endnote
%0 Conference Paper
%T Sliced Wasserstein Kernel for Persistence Diagrams
%A Mathieu Carrière
%A Marco Cuturi
%A Steve Oudot
%B Proceedings of the 34th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2017
%E Doina Precup
%E Yee Whye Teh
%F pmlr-v70-carriere17a
%I PMLR
%P 664--673
%U https://proceedings.mlr.press/v70/carriere17a.html
%V 70
%X Persistence diagrams (PDs) play a key role in topological data analysis (TDA), in which they are routinely used to describe succinctly complex topological properties of complicated shapes. PDs enjoy strong stability properties and have proven their utility in various learning contexts. They do not, however, live in a space naturally endowed with a Hilbert structure and are usually compared with specific distances, such as the bottleneck distance. To incorporate PDs in a learning pipeline, several kernels have been proposed for PDs with a strong emphasis on the stability of the RKHS distance w.r.t. perturbations of the PDs. In this article, we use the Sliced Wasserstein approximation of the Wasserstein distance to define a new kernel for PDs, which is not only provably stable but also provably discriminative w.r.t. the Wasserstein distance $W^1_\infty$ between PDs. We also demonstrate its practicality, by developing an approximation technique to reduce kernel computation time, and show that our proposal compares favorably to existing kernels for PDs on several benchmarks.
APA
Carrière, M., Cuturi, M. & Oudot, S. (2017). Sliced Wasserstein Kernel for Persistence Diagrams. Proceedings of the 34th International Conference on Machine Learning, in Proceedings of Machine Learning Research 70:664-673. Available from https://proceedings.mlr.press/v70/carriere17a.html.