How Inverse Conditional Flows Can Serve as a Substitute for Distributional Regression

Lucas Kook, Chris Kolb, Philipp Schiele, Daniel Dold, Marcel Arpogaus, Cornelius Fritz, Philipp Baumann, Philipp Kopper, Tobias Pielok, Emilio Dorigatti, David Rügamer
Proceedings of the Fortieth Conference on Uncertainty in Artificial Intelligence, PMLR 244:2029-2046, 2024.

Abstract

Neural network representations of simple models, such as linear regression, are increasingly being studied to better understand the underlying principles of deep learning algorithms. However, neural representations of distributional regression models, such as the Cox model, have received little attention so far. We close this gap by proposing a framework for distributional regression using inverse flow transformations (DRIFT), which includes neural representations of the aforementioned models. We empirically demonstrate that the neural representations of models in DRIFT can serve as a substitute for their classical statistical counterparts in several applications involving continuous, ordered, time-series, and survival outcomes. We confirm that models in DRIFT empirically match the performance of several statistical methods in terms of estimation of partial effects, prediction, and aleatoric uncertainty quantification. DRIFT covers both interpretable statistical models and flexible neural networks, opening up new avenues in both statistical modeling and deep learning.
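To make the core idea concrete: a transformation-model view of distributional regression posits a monotone map h(y | x) sending the response to a simple base distribution, so that the conditional density follows from the change-of-variables formula, p(y | x) = p_Z(h(y | x)) |∂h/∂y|. The sketch below is an illustration of this general principle only, not the authors' DRIFT implementation; the linear form h(y | x) = a·y + b₀ − β·x and the simulated data are assumptions chosen for brevity.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

# Simulated data: true conditional mean slope is 1.5, noise sd 1.
rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)
y = 2.0 + 1.5 * x + rng.normal(size=n)

def nll(params):
    """Negative log-likelihood of a linear transformation model
    h(y | x) = a*y + b0 - beta*x with standard-normal base."""
    log_a, b0, beta = params
    a = np.exp(log_a)              # a > 0 keeps h monotone increasing in y
    z = a * y + b0 - beta * x      # z = h(y | x) should look standard normal
    # Change of variables: log p(y|x) = log phi(z) + log |dh/dy| = log phi(z) + log a
    return -(norm.logpdf(z).sum() + n * log_a)

res = minimize(nll, x0=np.zeros(3))
a_hat = np.exp(res.x[0])
beta_hat = res.x[2]
# Implied conditional mean slope of y on x is beta / a (here close to 1.5),
# and 1 / a recovers the residual standard deviation (here close to 1).
```

Richer members of the framework replace the linear-in-y part with a flexible monotone function (e.g. a basis expansion or a neural network), which is what lets the same likelihood machinery cover continuous, ordered, and survival outcomes.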

Cite this Paper


BibTeX
@InProceedings{pmlr-v244-kook24a,
  title     = {How Inverse Conditional Flows Can Serve as a Substitute for Distributional Regression},
  author    = {Kook, Lucas and Kolb, Chris and Schiele, Philipp and Dold, Daniel and Arpogaus, Marcel and Fritz, Cornelius and Baumann, Philipp and Kopper, Philipp and Pielok, Tobias and Dorigatti, Emilio and R\"ugamer, David},
  booktitle = {Proceedings of the Fortieth Conference on Uncertainty in Artificial Intelligence},
  pages     = {2029--2046},
  year      = {2024},
  editor    = {Kiyavash, Negar and Mooij, Joris M.},
  volume    = {244},
  series    = {Proceedings of Machine Learning Research},
  month     = {15--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v244/main/assets/kook24a/kook24a.pdf},
  url       = {https://proceedings.mlr.press/v244/kook24a.html},
  abstract  = {Neural network representations of simple models, such as linear regression, are being studied increasingly to better understand the underlying principles of deep learning algorithms. However, neural representations of distributional regression models, such as the Cox model, have received little attention so far. We close this gap by proposing a framework for distributional regression using inverse flow transformations (DRIFT), which includes neural representations of the aforementioned models. We empirically demonstrate that the neural representations of models in DRIFT can serve as a substitute for their classical statistical counterparts in several applications involving continuous, ordered, time-series, and survival outcomes. We confirm that models in DRIFT empirically match the performance of several statistical methods in terms of estimation of partial effects, prediction, and aleatoric uncertainty quantification. DRIFT covers both interpretable statistical models and flexible neural networks opening up new avenues in both statistical modeling and deep learning.}
}
Endnote
%0 Conference Paper
%T How Inverse Conditional Flows Can Serve as a Substitute for Distributional Regression
%A Lucas Kook
%A Chris Kolb
%A Philipp Schiele
%A Daniel Dold
%A Marcel Arpogaus
%A Cornelius Fritz
%A Philipp Baumann
%A Philipp Kopper
%A Tobias Pielok
%A Emilio Dorigatti
%A David Rügamer
%B Proceedings of the Fortieth Conference on Uncertainty in Artificial Intelligence
%C Proceedings of Machine Learning Research
%D 2024
%E Negar Kiyavash
%E Joris M. Mooij
%F pmlr-v244-kook24a
%I PMLR
%P 2029--2046
%U https://proceedings.mlr.press/v244/kook24a.html
%V 244
%X Neural network representations of simple models, such as linear regression, are being studied increasingly to better understand the underlying principles of deep learning algorithms. However, neural representations of distributional regression models, such as the Cox model, have received little attention so far. We close this gap by proposing a framework for distributional regression using inverse flow transformations (DRIFT), which includes neural representations of the aforementioned models. We empirically demonstrate that the neural representations of models in DRIFT can serve as a substitute for their classical statistical counterparts in several applications involving continuous, ordered, time-series, and survival outcomes. We confirm that models in DRIFT empirically match the performance of several statistical methods in terms of estimation of partial effects, prediction, and aleatoric uncertainty quantification. DRIFT covers both interpretable statistical models and flexible neural networks opening up new avenues in both statistical modeling and deep learning.
APA
Kook, L., Kolb, C., Schiele, P., Dold, D., Arpogaus, M., Fritz, C., Baumann, P., Kopper, P., Pielok, T., Dorigatti, E. & Rügamer, D. (2024). How Inverse Conditional Flows Can Serve as a Substitute for Distributional Regression. Proceedings of the Fortieth Conference on Uncertainty in Artificial Intelligence, in Proceedings of Machine Learning Research 244:2029-2046. Available from https://proceedings.mlr.press/v244/kook24a.html.