TRADE: Transfer of Distributions between External Conditions with Normalizing Flows

Stefan Wahl, Armand Rousselot, Felix Draxler, Ullrich Koethe
Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, PMLR 258:3196-3204, 2025.

Abstract

Modeling distributions that depend on external control parameters is a common scenario in diverse applications like molecular simulations, where system properties like temperature affect molecular configurations. Despite the relevance of these applications, existing solutions are unsatisfactory as they require severely restricted model architectures or rely on energy-based training, which is prone to instability. We introduce TRADE, which overcomes these limitations by formulating the learning process as a boundary value problem. By initially training the model for a specific condition using either i.i.d. samples or backward KL training, we establish a boundary distribution. We then propagate this information across other conditions using the gradient of the unnormalized density with respect to the external parameter. This formulation, akin to the principles of physics-informed neural networks, allows us to efficiently learn parameter-dependent distributions without restrictive assumptions. Experimentally, we demonstrate that TRADE achieves excellent results in a wide range of applications, ranging from Bayesian inference and molecular simulations to physical lattice models.
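The core idea from the abstract — fix the model at one boundary condition, then transfer it to other conditions by matching the derivative of the model's log-density with respect to the external parameter to that of the target — can be illustrated on a toy problem. The sketch below is my own illustration, not the paper's code: the target family is p(x | β) ∝ exp(−βx²/2) (a Gaussian with inverse-temperature β), the "flow" is a deliberately simple scale model q(x | β) = N(0, σ(β)²) with log σ(β) = a + b·log β, and all names and hyperparameters are hypothetical. For this target, ∂β log p(x | β) = −x²/2 + 1/(2β), where the 1/(2β) term comes from differentiating log Z(β); the transfer loss penalizes the squared mismatch between this and ∂β log q.

```python
# Toy sketch of a TRADE-style objective (illustrative only, not the paper's code).
# Boundary: maximum likelihood at beta0 = 1 on i.i.d. samples from the target.
# Transfer: match d/dbeta log q(x|beta) to d/dbeta log p(x|beta) over a beta grid.
# Exact solution: a = 0, b = -0.5, i.e. sigma(beta)^2 = 1/beta.
import numpy as np

rng = np.random.default_rng(0)
x_boundary = rng.standard_normal(2000)   # samples from p(x | beta0 = 1) = N(0, 1)
x_grid = rng.standard_normal(200)        # evaluation points for the residual
beta_grid = np.linspace(0.5, 4.0, 20)    # conditions we transfer to

def loss(params):
    a, b = params
    # Boundary term: negative log-likelihood at beta0 = 1 (sigma = exp(a)).
    nll = np.mean(x_boundary**2) / (2 * np.exp(2 * a)) + a
    # Transfer term: squared mismatch of the beta-derivatives of the log-densities.
    res = 0.0
    for beta in beta_grid:
        sigma = np.exp(a + b * np.log(beta))
        d_logq = (x_grid**2 / sigma**2 - 1.0) * b / beta
        d_logp = -x_grid**2 / 2 + 1 / (2 * beta)
        res += np.mean((d_logq - d_logp) ** 2)
    return nll + res / len(beta_grid)

def num_grad(f, p, eps=1e-5):
    # Central-difference gradient; a real implementation would use autodiff.
    g = np.zeros_like(p)
    for i in range(len(p)):
        e = np.zeros_like(p); e[i] = eps
        g[i] = (f(p + e) - f(p - e)) / (2 * eps)
    return g

params = np.array([0.3, 0.0])            # start away from the solution
for _ in range(4000):
    params -= 0.05 * num_grad(loss, params)

a, b = params
print(f"a = {a:.3f} (exact 0), b = {b:.3f} (exact -0.5)")
```

The boundary term only anchors the density at β₀ = 1; it is the derivative-matching term that propagates the fit to all other β, which is the PINN-like boundary-value structure the abstract describes.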

Cite this Paper


BibTeX
@InProceedings{pmlr-v258-wahl25a,
  title     = {TRADE: Transfer of Distributions between External Conditions with Normalizing Flows},
  author    = {Wahl, Stefan and Rousselot, Armand and Draxler, Felix and Koethe, Ullrich},
  booktitle = {Proceedings of The 28th International Conference on Artificial Intelligence and Statistics},
  pages     = {3196--3204},
  year      = {2025},
  editor    = {Li, Yingzhen and Mandt, Stephan and Agrawal, Shipra and Khan, Emtiyaz},
  volume    = {258},
  series    = {Proceedings of Machine Learning Research},
  month     = {03--05 May},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v258/main/assets/wahl25a/wahl25a.pdf},
  url       = {https://proceedings.mlr.press/v258/wahl25a.html},
  abstract  = {Modeling distributions that depend on external control parameters is a common scenario in diverse applications like molecular simulations, where system properties like temperature affect molecular configurations. Despite the relevance of these applications, existing solutions are unsatisfactory as they require severely restricted model architectures or rely on energy-based training, which is prone to instability. We introduce TRADE, which overcomes these limitations by formulating the learning process as a boundary value problem. By initially training the model for a specific condition using either i.i.d. samples or backward KL training, we establish a boundary distribution. We then propagate this information across other conditions using the gradient of the unnormalized density with respect to the external parameter. This formulation, akin to the principles of physics-informed neural networks, allows us to efficiently learn parameter-dependent distributions without restrictive assumptions. Experimentally, we demonstrate that TRADE achieves excellent results in a wide range of applications, ranging from Bayesian inference and molecular simulations to physical lattice models.}
}
Endnote
%0 Conference Paper
%T TRADE: Transfer of Distributions between External Conditions with Normalizing Flows
%A Stefan Wahl
%A Armand Rousselot
%A Felix Draxler
%A Ullrich Koethe
%B Proceedings of The 28th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2025
%E Yingzhen Li
%E Stephan Mandt
%E Shipra Agrawal
%E Emtiyaz Khan
%F pmlr-v258-wahl25a
%I PMLR
%P 3196--3204
%U https://proceedings.mlr.press/v258/wahl25a.html
%V 258
%X Modeling distributions that depend on external control parameters is a common scenario in diverse applications like molecular simulations, where system properties like temperature affect molecular configurations. Despite the relevance of these applications, existing solutions are unsatisfactory as they require severely restricted model architectures or rely on energy-based training, which is prone to instability. We introduce TRADE, which overcomes these limitations by formulating the learning process as a boundary value problem. By initially training the model for a specific condition using either i.i.d. samples or backward KL training, we establish a boundary distribution. We then propagate this information across other conditions using the gradient of the unnormalized density with respect to the external parameter. This formulation, akin to the principles of physics-informed neural networks, allows us to efficiently learn parameter-dependent distributions without restrictive assumptions. Experimentally, we demonstrate that TRADE achieves excellent results in a wide range of applications, ranging from Bayesian inference and molecular simulations to physical lattice models.
APA
Wahl, S., Rousselot, A., Draxler, F. &amp; Koethe, U. (2025). TRADE: Transfer of Distributions between External Conditions with Normalizing Flows. Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 258:3196-3204. Available from https://proceedings.mlr.press/v258/wahl25a.html.

Related Material