Scalable Gradients and Variational Inference for Stochastic Differential Equations
Proceedings of The 2nd Symposium on Advances in Approximate Bayesian Inference, PMLR 118:128, 2020.
Abstract
We derive reverse-mode (or adjoint) automatic differentiation for solutions of stochastic differential equations (SDEs), allowing time-efficient and constant-memory computation of pathwise gradients, a continuous-time analogue of the reparameterization trick. Specifically, we construct a backward SDE whose solution is the gradient and provide conditions under which numerical solutions converge. We also combine our stochastic adjoint approach with a stochastic variational inference scheme for continuous-time SDE models, allowing us to learn distributions over functions using stochastic gradient descent. Our latent SDE model achieves competitive performance compared to existing approaches on time series modeling.
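To give a sense of the pathwise-gradient idea, the following is a minimal sketch (not the paper's method, which operates on a continuous-time backward SDE): for a scalar linear SDE dX = θX dt + σ dW discretized by Euler-Maruyama, we fix the noise path (the reparameterization trick), run a discrete backward adjoint recursion to get dL/dθ for L = X_N, and check it against a finite difference computed with the same noise. The SDE, step size, and loss here are illustrative choices, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
theta, sigma, h, N = 0.5, 0.3, 0.01, 100
x0 = 1.0
dW = rng.normal(0.0, np.sqrt(h), N)  # fixed Brownian increments (reparameterization)

def simulate(theta):
    """Euler-Maruyama for dX = theta*X dt + sigma dW with a fixed noise path."""
    x = x0
    xs = [x]
    for n in range(N):
        x = x + theta * x * h + sigma * dW[n]
        xs.append(x)
    return xs

xs = simulate(theta)

# Backward (adjoint) pass for the loss L = X_N.
# a_n = dL/dX_n satisfies a_n = a_{n+1} * (1 + theta*h), with a_N = 1;
# each step contributes a_{n+1} * (dX_{n+1}/dtheta) = a_{n+1} * x_n * h.
a = 1.0
grad = 0.0
for n in reversed(range(N)):
    grad += a * xs[n] * h
    a = a * (1.0 + theta * h)

# Pathwise finite-difference check: same noise, perturbed theta.
eps = 1e-6
fd = (simulate(theta + eps)[-1] - simulate(theta - eps)[-1]) / (2 * eps)
print(abs(grad - fd) < 1e-6)
```

Because the noise path is held fixed, the loss is a differentiable function of θ and the adjoint recursion agrees with the finite difference; the paper's contribution is making the backward pass itself a (memory-efficient) SDE solve rather than a stored-trajectory recursion like this one.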