ReLU Neural Networks, Polyhedral Decompositions, and Persistent Homology

Yajing Liu, Christina M Cole, Chris Peterson, Michael Kirby
Proceedings of 2nd Annual Workshop on Topology, Algebra, and Geometry in Machine Learning (TAG-ML), PMLR 221:455-468, 2023.

Abstract

A ReLU neural network leads to a finite polyhedral decomposition of input space and a corresponding finite dual graph. We show that while this dual graph is a coarse quantization of input space, it is sufficiently robust that it can be combined with persistent homology to detect homological signals of manifolds in the input space from samples. This property holds for a wide range of networks trained for a wide range of purposes that have nothing to do with this topological application. We found this feature to be surprising and interesting; we hope it will also be useful.

Cite this Paper


BibTeX
@InProceedings{pmlr-v221-liu23a,
  title     = {ReLU Neural Networks, Polyhedral Decompositions, and Persistent Homology},
  author    = {Liu, Yajing and Cole, Christina M and Peterson, Chris and Kirby, Michael},
  booktitle = {Proceedings of 2nd Annual Workshop on Topology, Algebra, and Geometry in Machine Learning (TAG-ML)},
  pages     = {455--468},
  year      = {2023},
  editor    = {Doster, Timothy and Emerson, Tegan and Kvinge, Henry and Miolane, Nina and Papillon, Mathilde and Rieck, Bastian and Sanborn, Sophia},
  volume    = {221},
  series    = {Proceedings of Machine Learning Research},
  month     = {28 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v221/liu23a/liu23a.pdf},
  url       = {https://proceedings.mlr.press/v221/liu23a.html},
  abstract  = {A ReLU neural network leads to a finite polyhedral decomposition of input space and a corresponding finite dual graph. We show that while this dual graph is a coarse quantization of input space, it is sufficiently robust that it can be combined with persistent homology to detect homological signals of manifolds in the input space from samples. This property holds for a wide range of networks trained for a wide range of purposes that have nothing to do with this topological application. We found this feature to be surprising and interesting; we hope it will also be useful.}
}
Endnote
%0 Conference Paper
%T ReLU Neural Networks, Polyhedral Decompositions, and Persistent Homology
%A Yajing Liu
%A Christina M Cole
%A Chris Peterson
%A Michael Kirby
%B Proceedings of 2nd Annual Workshop on Topology, Algebra, and Geometry in Machine Learning (TAG-ML)
%C Proceedings of Machine Learning Research
%D 2023
%E Timothy Doster
%E Tegan Emerson
%E Henry Kvinge
%E Nina Miolane
%E Mathilde Papillon
%E Bastian Rieck
%E Sophia Sanborn
%F pmlr-v221-liu23a
%I PMLR
%P 455--468
%U https://proceedings.mlr.press/v221/liu23a.html
%V 221
%X A ReLU neural network leads to a finite polyhedral decomposition of input space and a corresponding finite dual graph. We show that while this dual graph is a coarse quantization of input space, it is sufficiently robust that it can be combined with persistent homology to detect homological signals of manifolds in the input space from samples. This property holds for a wide range of networks trained for a wide range of purposes that have nothing to do with this topological application. We found this feature to be surprising and interesting; we hope it will also be useful.
APA
Liu, Y., Cole, C.M., Peterson, C. &amp; Kirby, M. (2023). ReLU Neural Networks, Polyhedral Decompositions, and Persistent Homology. Proceedings of 2nd Annual Workshop on Topology, Algebra, and Geometry in Machine Learning (TAG-ML), in Proceedings of Machine Learning Research 221:455-468. Available from https://proceedings.mlr.press/v221/liu23a.html.