Structure Preserving Neural Networks: A Case Study in the Entropy Closure of the Boltzmann Equation

Steffen Schotthöfer, Tianbai Xiao, Martin Frank, Cory Hauck
Proceedings of the 39th International Conference on Machine Learning, PMLR 162:19406-19433, 2022.

Abstract

In this paper, we explore applications of deep learning in statistical physics. We choose the Boltzmann equation as a typical example, where neural networks serve as a closure to its moment system. We present two types of neural networks to embed the convexity of entropy and to preserve the minimum entropy principle and intrinsic mathematical structures of the moment system of the Boltzmann equation. We derive an error bound for the generalization gap of convex neural networks which are trained in Sobolev norm and use the results to construct data sampling methods for neural network training. Numerical experiments demonstrate that the neural entropy closure is significantly faster than classical optimizers while maintaining sufficient accuracy.
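The abstract's key structural ingredient, a neural network that is convex in its input so that the entropy closure inherits convexity, can be illustrated with a minimal input-convex network in the style of Amos et al. (2017). This is a hypothetical sketch with made-up layer sizes, not the authors' architecture: convexity in the input follows from non-negative hidden-to-output weights combined with a convex, non-decreasing activation.

```python
import numpy as np

rng = np.random.default_rng(0)

def softplus(x):
    # Convex and non-decreasing, so it preserves convexity under composition.
    return np.log1p(np.exp(x))

class ICNN:
    """Minimal one-hidden-layer input-convex network (illustrative only)."""

    def __init__(self, dim_in, dim_hidden):
        # Weights acting on hidden activations must stay non-negative to
        # preserve convexity in the input u; the direct skip from u may be
        # unconstrained, since an affine term is convex.
        self.Wu0 = rng.normal(size=(dim_hidden, dim_in))
        self.b0 = np.zeros(dim_hidden)
        self.Wz1 = np.abs(rng.normal(size=(1, dim_hidden)))  # constrained >= 0
        self.Wu1 = rng.normal(size=(1, dim_in))
        self.b1 = np.zeros(1)

    def __call__(self, u):
        z1 = softplus(self.Wu0 @ u + self.b0)
        # Non-negative combination of convex functions plus an affine term
        # is convex in u.
        return float(self.Wz1 @ z1 + self.Wu1 @ u + self.b1)

net = ICNN(dim_in=2, dim_hidden=8)

# Spot-check convexity along one segment:
# f(t*a + (1-t)*b) <= t*f(a) + (1-t)*f(b)
a, b, t = np.array([1.0, -0.5]), np.array([-2.0, 0.3]), 0.4
assert net(t * a + (1 - t) * b) <= t * net(a) + (1 - t) * net(b) + 1e-12
```

Training such a surrogate in a Sobolev norm, as the abstract describes, would additionally penalize the mismatch of the network's gradient against target multipliers, not just the function values.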

Cite this Paper


BibTeX
@InProceedings{pmlr-v162-schotthofer22a,
  title     = {Structure Preserving Neural Networks: A Case Study in the Entropy Closure of the Boltzmann Equation},
  author    = {Schotth{\"o}fer, Steffen and Xiao, Tianbai and Frank, Martin and Hauck, Cory},
  booktitle = {Proceedings of the 39th International Conference on Machine Learning},
  pages     = {19406--19433},
  year      = {2022},
  editor    = {Chaudhuri, Kamalika and Jegelka, Stefanie and Song, Le and Szepesvari, Csaba and Niu, Gang and Sabato, Sivan},
  volume    = {162},
  series    = {Proceedings of Machine Learning Research},
  month     = {17--23 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v162/schotthofer22a/schotthofer22a.pdf},
  url       = {https://proceedings.mlr.press/v162/schotthofer22a.html},
  abstract  = {In this paper, we explore applications of deep learning in statistical physics. We choose the Boltzmann equation as a typical example, where neural networks serve as a closure to its moment system. We present two types of neural networks to embed the convexity of entropy and to preserve the minimum entropy principle and intrinsic mathematical structures of the moment system of the Boltzmann equation. We derive an error bound for the generalization gap of convex neural networks which are trained in Sobolev norm and use the results to construct data sampling methods for neural network training. Numerical experiments demonstrate that the neural entropy closure is significantly faster than classical optimizers while maintaining sufficient accuracy.}
}
Endnote
%0 Conference Paper
%T Structure Preserving Neural Networks: A Case Study in the Entropy Closure of the Boltzmann Equation
%A Steffen Schotthöfer
%A Tianbai Xiao
%A Martin Frank
%A Cory Hauck
%B Proceedings of the 39th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2022
%E Kamalika Chaudhuri
%E Stefanie Jegelka
%E Le Song
%E Csaba Szepesvari
%E Gang Niu
%E Sivan Sabato
%F pmlr-v162-schotthofer22a
%I PMLR
%P 19406--19433
%U https://proceedings.mlr.press/v162/schotthofer22a.html
%V 162
%X In this paper, we explore applications of deep learning in statistical physics. We choose the Boltzmann equation as a typical example, where neural networks serve as a closure to its moment system. We present two types of neural networks to embed the convexity of entropy and to preserve the minimum entropy principle and intrinsic mathematical structures of the moment system of the Boltzmann equation. We derive an error bound for the generalization gap of convex neural networks which are trained in Sobolev norm and use the results to construct data sampling methods for neural network training. Numerical experiments demonstrate that the neural entropy closure is significantly faster than classical optimizers while maintaining sufficient accuracy.
APA
Schotthöfer, S., Xiao, T., Frank, M., & Hauck, C. (2022). Structure Preserving Neural Networks: A Case Study in the Entropy Closure of the Boltzmann Equation. Proceedings of the 39th International Conference on Machine Learning, in Proceedings of Machine Learning Research 162:19406-19433. Available from https://proceedings.mlr.press/v162/schotthofer22a.html.

Related Material