Phase Transitions, Distance Functions, and Implicit Neural Representations

Yaron Lipman
Proceedings of the 38th International Conference on Machine Learning, PMLR 139:6702-6712, 2021.

Abstract

Representing surfaces as zero level sets of neural networks recently emerged as a powerful modeling paradigm, named Implicit Neural Representations (INRs), serving numerous downstream applications in geometric deep learning and 3D vision. Training INRs previously required choosing between occupancy and distance function representations, and among different losses with unknown limit behavior and/or bias. In this paper we draw inspiration from the theory of phase transitions of fluids and suggest a loss for training INRs that learns a density function that converges to a proper occupancy function, while its log transform converges to a distance function. Furthermore, we analyze the limit minimizer of this loss, showing that it satisfies the reconstruction constraints and has minimal surface perimeter, a desirable inductive bias for surface reconstruction. Training INRs with this new loss leads to state-of-the-art reconstructions on a standard benchmark.
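To make the idea concrete, the following is a minimal sketch (not the paper's exact formulation) of a phase-transition style training loss in PyTorch. It assumes the classical Modica-Mortola double-well potential W(u) = (1 - u^2)^2 / 4 and a simple point-sample reconstruction term; the function names, the weight lam, and the choice of potential are illustrative assumptions. The log transform shown inverts the optimal 1D transition profile u = tanh(d / (2*eps)) for this potential.

```python
import torch

def phase_loss(net, x_surface, x_domain, eps=0.01, lam=100.0):
    """Phase-transition style loss for an implicit network u = net(x).

    net       : MLP mapping R^3 -> R, pushed toward the two phases -1/+1.
    x_surface : (n, 3) samples on the target surface (u should vanish there).
    x_domain  : (m, 3) samples in the ambient volume.
    """
    x_domain = x_domain.clone().requires_grad_(True)
    u = net(x_domain)
    # Dirichlet term eps * |grad u|^2, computed via autograd.
    grad_u = torch.autograd.grad(u.sum(), x_domain, create_graph=True)[0]
    dirichlet = eps * (grad_u ** 2).sum(dim=-1).mean()
    # Double-well term (1/eps) * W(u) with W(u) = (1 - u^2)^2 / 4,
    # driving u toward a sharp inside/outside occupancy.
    well = ((1.0 - u ** 2) ** 2 / 4.0).mean() / eps
    # Reconstruction constraint: u = 0 on the sampled surface points.
    recon = net(x_surface).abs().mean()
    return dirichlet + well + lam * recon

def log_transform(u, eps=0.01):
    """Approximate signed distance from the converged density u.

    Inverts the 1D optimal profile u = tanh(d / (2 * eps)):
    d = eps * log((1 + u) / (1 - u)).
    """
    return eps * torch.log((1.0 + u) / (1.0 - u))
```

As eps shrinks, minimizers of the Dirichlet plus double-well terms concentrate their transition layer on a set of minimal perimeter, which is the inductive bias the abstract refers to, while the log transform recovers an approximate signed distance from the learned density.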

Cite this Paper


BibTeX
@InProceedings{pmlr-v139-lipman21a,
  title     = {Phase Transitions, Distance Functions, and Implicit Neural Representations},
  author    = {Lipman, Yaron},
  booktitle = {Proceedings of the 38th International Conference on Machine Learning},
  pages     = {6702--6712},
  year      = {2021},
  editor    = {Meila, Marina and Zhang, Tong},
  volume    = {139},
  series    = {Proceedings of Machine Learning Research},
  month     = {18--24 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v139/lipman21a/lipman21a.pdf},
  url       = {https://proceedings.mlr.press/v139/lipman21a.html},
  abstract  = {Representing surfaces as zero level sets of neural networks recently emerged as a powerful modeling paradigm, named Implicit Neural Representations (INRs), serving numerous downstream applications in geometric deep learning and 3D vision. Training INRs previously required choosing between occupancy and distance function representation and different losses with unknown limit behavior and/or bias. In this paper we draw inspiration from the theory of phase transitions of fluids and suggest a loss for training INRs that learns a density function that converges to a proper occupancy function, while its log transform converges to a distance function. Furthermore, we analyze the limit minimizer of this loss showing it satisfies the reconstruction constraints and has minimal surface perimeter, a desirable inductive bias for surface reconstruction. Training INRs with this new loss leads to state-of-the-art reconstructions on a standard benchmark.}
}
Endnote
%0 Conference Paper
%T Phase Transitions, Distance Functions, and Implicit Neural Representations
%A Yaron Lipman
%B Proceedings of the 38th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2021
%E Marina Meila
%E Tong Zhang
%F pmlr-v139-lipman21a
%I PMLR
%P 6702--6712
%U https://proceedings.mlr.press/v139/lipman21a.html
%V 139
%X Representing surfaces as zero level sets of neural networks recently emerged as a powerful modeling paradigm, named Implicit Neural Representations (INRs), serving numerous downstream applications in geometric deep learning and 3D vision. Training INRs previously required choosing between occupancy and distance function representation and different losses with unknown limit behavior and/or bias. In this paper we draw inspiration from the theory of phase transitions of fluids and suggest a loss for training INRs that learns a density function that converges to a proper occupancy function, while its log transform converges to a distance function. Furthermore, we analyze the limit minimizer of this loss showing it satisfies the reconstruction constraints and has minimal surface perimeter, a desirable inductive bias for surface reconstruction. Training INRs with this new loss leads to state-of-the-art reconstructions on a standard benchmark.
APA
Lipman, Y. (2021). Phase Transitions, Distance Functions, and Implicit Neural Representations. Proceedings of the 38th International Conference on Machine Learning, in Proceedings of Machine Learning Research 139:6702-6712. Available from https://proceedings.mlr.press/v139/lipman21a.html.
