Constraining the Dynamics of Deep Probabilistic Models

Marco Lorenzi, Maurizio Filippone
Proceedings of the 35th International Conference on Machine Learning, PMLR 80:3227-3236, 2018.

Abstract

We introduce a novel generative formulation of deep probabilistic models implementing "soft" constraints on their function dynamics. In particular, we develop a flexible methodological framework where the modeled functions and derivatives of a given order are subject to inequality or equality constraints. We then characterize the posterior distribution over model and constraint parameters through stochastic variational inference. As a result, the proposed approach allows for accurate and scalable uncertainty quantification on the predictions and on all parameters. We demonstrate the application of equality constraints in the challenging problem of parameter inference in ordinary differential equation models, while we showcase the application of inequality constraints on the problem of monotonic regression of count data. The proposed approach is extensively tested in several experimental settings, leading to highly competitive results in challenging modeling applications, while offering high expressiveness, flexibility and scalability.
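The core idea in the abstract — constraints on a function and its derivatives entering the objective as "soft" penalties rather than hard restrictions — can be illustrated compactly on the monotonic count-regression example. The sketch below is an assumption-laden toy, not the paper's method: it fits a small network under a Poisson likelihood and penalises negative derivatives, whereas the paper places the constraints inside a generative deep probabilistic model and infers all parameters, including those of the constraints, with stochastic variational inference. The network architecture, the Poisson link, and the penalty weight lam are all hypothetical choices made for illustration.

import torch

torch.manual_seed(0)

# Toy data: noisy counts whose underlying trend is monotonically increasing.
x = torch.linspace(0.0, 1.0, 100).unsqueeze(1)
y = torch.poisson(torch.exp(1.0 + 2.0 * x)).squeeze(1)
x.requires_grad_(True)  # needed to differentiate the model w.r.t. its input

# A small network modelling the log-rate of the counts (an illustrative
# stand-in for the deep probabilistic model used in the paper).
f = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(), torch.nn.Linear(32, 1)
)

lam = 10.0  # strength of the soft constraint (hypothetical value)
opt = torch.optim.Adam(f.parameters(), lr=1e-2)

for step in range(2000):
    opt.zero_grad()
    log_rate = f(x).squeeze(1)
    # Poisson negative log-likelihood (up to a constant) for the counts.
    nll = (torch.exp(log_rate) - y * log_rate).mean()
    # First derivative df/dx at the training inputs, via autograd.
    df_dx = torch.autograd.grad(log_rate.sum(), x, create_graph=True)[0]
    # Soft inequality constraint df/dx >= 0: penalise negative slopes only.
    penalty = torch.relu(-df_dx).mean()
    (nll + lam * penalty).backward()
    opt.step()

For equality constraints, as in the ODE parameter-inference application, the one-sided penalty above would instead be a squared residual of the constraint, e.g. the residual of the differential equation evaluated along the modelled function.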

Cite this Paper

BibTeX
@InProceedings{pmlr-v80-lorenzi18a,
  title     = {Constraining the Dynamics of Deep Probabilistic Models},
  author    = {Lorenzi, Marco and Filippone, Maurizio},
  booktitle = {Proceedings of the 35th International Conference on Machine Learning},
  pages     = {3227--3236},
  year      = {2018},
  editor    = {Dy, Jennifer and Krause, Andreas},
  volume    = {80},
  series    = {Proceedings of Machine Learning Research},
  month     = {10--15 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v80/lorenzi18a/lorenzi18a.pdf},
  url       = {https://proceedings.mlr.press/v80/lorenzi18a.html}
}
Endnote
%0 Conference Paper
%T Constraining the Dynamics of Deep Probabilistic Models
%A Marco Lorenzi
%A Maurizio Filippone
%B Proceedings of the 35th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2018
%E Jennifer Dy
%E Andreas Krause
%F pmlr-v80-lorenzi18a
%I PMLR
%P 3227--3236
%U https://proceedings.mlr.press/v80/lorenzi18a.html
%V 80
APA
Lorenzi, M., & Filippone, M. (2018). Constraining the Dynamics of Deep Probabilistic Models. Proceedings of the 35th International Conference on Machine Learning, in Proceedings of Machine Learning Research 80:3227-3236. Available from https://proceedings.mlr.press/v80/lorenzi18a.html.