$\Phi_\textrm{Flow}$: Differentiable Simulations for PyTorch, TensorFlow and Jax

Philipp Holl, Nils Thuerey
Proceedings of the 41st International Conference on Machine Learning, PMLR 235:18515-18546, 2024.

Abstract

Differentiable processes have proven an invaluable tool for machine learning (ML) in scientific and engineering settings, but most ML libraries are not primarily designed for such applications. We present $\Phi_\textrm{Flow}$, a Python toolkit that seamlessly integrates with PyTorch, TensorFlow, Jax and NumPy, simplifying the process of writing differentiable simulation code at every step. $\Phi_\textrm{Flow}$ provides many essential features that go beyond the capabilities of the base libraries, such as differential operators, boundary conditions, the ability to write dimensionality-agnostic code, floating-point precision management, fully differentiable preconditioned (sparse) linear solves, automatic matrix generation via function tracing, integration of SciPy optimizers, simulation vectorization, and visualization tools. At the same time, $\Phi_\textrm{Flow}$ inherits all important traits of the base ML libraries, such as GPU / TPU support, just-in-time compilation, and automatic differentiation. Put together, these features drastically simplify scientific code like PDE or ODE solvers on grids or unstructured meshes, and $\Phi_\textrm{Flow}$ even includes out-of-the-box support for fluid simulations. $\Phi_\textrm{Flow}$ has been used in various publications and as a ground-truth solver in multiple scientific data sets.
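
To give a concrete impression of the workflow the abstract describes, the sketch below loosely follows the library's documented smoke-plume example: dimensionality-agnostic grid construction, semi-Lagrangian advection, and a pressure projection backed by a differentiable sparse linear solve. Exact constructor signatures (CenteredGrid, StaggeredGrid, Box, fluid.make_incompressible) vary between $\Phi_\textrm{Flow}$ versions and should be read as assumptions rather than a definitive listing.

```python
# Minimal smoke-simulation sketch with PhiFlow, loosely following the library's
# documented smoke-plume example. Signatures are assumed from the documentation
# and may differ between versions.
from phi.flow import *  # NumPy backend; phi.torch.flow / phi.tf.flow / phi.jax.flow switch backends

# Dimensionality-agnostic setup: adding z=64 below would yield a 3D simulation.
smoke = CenteredGrid(Noise(), extrapolation.BOUNDARY, x=64, y=64, bounds=Box(x=100, y=100))
velocity = StaggeredGrid(0, extrapolation.ZERO, x=64, y=64, bounds=Box(x=100, y=100))

def step(smoke, velocity, dt=1.0):
    """One explicit time step: advection, buoyancy, differentiable pressure projection."""
    smoke = advect.semi_lagrangian(smoke, velocity, dt)
    buoyancy = smoke * (0.0, 0.1) @ velocity  # resample the scalar buoyancy force onto the staggered grid
    velocity = advect.semi_lagrangian(velocity, velocity, dt) + dt * buoyancy
    # make_incompressible solves a sparse linear system for the pressure; the solve is fully differentiable
    velocity, pressure = fluid.make_incompressible(velocity)
    return smoke, velocity

for _ in range(20):
    smoke, velocity = step(smoke, velocity)
```

Because the same code runs under all supported backends, swapping the import for the PyTorch, TensorFlow or Jax variant is enough to obtain gradients through the whole simulation loop.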

Cite this Paper


BibTeX
@InProceedings{pmlr-v235-holl24a,
  title     = {$\bf{Φ}_\textrm{Flow}$: Differentiable Simulations for {P}y{T}orch, {T}ensor{F}low and Jax},
  author    = {Holl, Philipp and Thuerey, Nils},
  booktitle = {Proceedings of the 41st International Conference on Machine Learning},
  pages     = {18515--18546},
  year      = {2024},
  editor    = {Salakhutdinov, Ruslan and Kolter, Zico and Heller, Katherine and Weller, Adrian and Oliver, Nuria and Scarlett, Jonathan and Berkenkamp, Felix},
  volume    = {235},
  series    = {Proceedings of Machine Learning Research},
  month     = {21--27 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v235/main/assets/holl24a/holl24a.pdf},
  url       = {https://proceedings.mlr.press/v235/holl24a.html},
  abstract  = {Differentiable processes have proven an invaluable tool for machine learning (ML) in scientific and engineering settings, but most ML libraries are not primarily designed for such applications. We present $\Phi_\textrm{Flow}$, a Python toolkit that seamlessly integrates with PyTorch, TensorFlow, Jax and NumPy, simplifying the process of writing differentiable simulation code at every step. $\Phi_\textrm{Flow}$ provides many essential features that go beyond the capabilities of the base libraries, such as differential operators, boundary conditions, the ability to write dimensionality-agnostic code, floating-point precision management, fully differentiable preconditioned (sparse) linear solves, automatic matrix generation via function tracing, integration of SciPy optimizers, simulation vectorization, and visualization tools. At the same time, $\Phi_\textrm{Flow}$ inherits all important traits of the base ML libraries, such as GPU / TPU support, just-in-time compilation, and automatic differentiation. Put together, these features drastically simplify scientific code like PDE or ODE solvers on grids or unstructured meshes, and $\Phi_\textrm{Flow}$ even includes out-of-the-box support for fluid simulations. $\Phi_\textrm{Flow}$ has been used in various publications and as a ground-truth solver in multiple scientific data sets.}
}
Endnote
%0 Conference Paper
%T $\bf{Φ}_\textrm{Flow}$: Differentiable Simulations for PyTorch, TensorFlow and Jax
%A Philipp Holl
%A Nils Thuerey
%B Proceedings of the 41st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2024
%E Ruslan Salakhutdinov
%E Zico Kolter
%E Katherine Heller
%E Adrian Weller
%E Nuria Oliver
%E Jonathan Scarlett
%E Felix Berkenkamp
%F pmlr-v235-holl24a
%I PMLR
%P 18515--18546
%U https://proceedings.mlr.press/v235/holl24a.html
%V 235
%X Differentiable processes have proven an invaluable tool for machine learning (ML) in scientific and engineering settings, but most ML libraries are not primarily designed for such applications. We present $\Phi_\textrm{Flow}$, a Python toolkit that seamlessly integrates with PyTorch, TensorFlow, Jax and NumPy, simplifying the process of writing differentiable simulation code at every step. $\Phi_\textrm{Flow}$ provides many essential features that go beyond the capabilities of the base libraries, such as differential operators, boundary conditions, the ability to write dimensionality-agnostic code, floating-point precision management, fully differentiable preconditioned (sparse) linear solves, automatic matrix generation via function tracing, integration of SciPy optimizers, simulation vectorization, and visualization tools. At the same time, $\Phi_\textrm{Flow}$ inherits all important traits of the base ML libraries, such as GPU / TPU support, just-in-time compilation, and automatic differentiation. Put together, these features drastically simplify scientific code like PDE or ODE solvers on grids or unstructured meshes, and $\Phi_\textrm{Flow}$ even includes out-of-the-box support for fluid simulations. $\Phi_\textrm{Flow}$ has been used in various publications and as a ground-truth solver in multiple scientific data sets.
APA
Holl, P. & Thuerey, N. (2024). $\Phi_\textrm{Flow}$: Differentiable Simulations for PyTorch, TensorFlow and Jax. Proceedings of the 41st International Conference on Machine Learning, in Proceedings of Machine Learning Research 235:18515-18546. Available from https://proceedings.mlr.press/v235/holl24a.html.