Latent Optimal Paths by Gumbel Propagation for Variational Bayesian Dynamic Programming

Xinlei Niu, Christian Walder, Jing Zhang, Charles Patrick Martin
Proceedings of the 41st International Conference on Machine Learning, PMLR 235:38316-38343, 2024.

Abstract

We propose the stochastic optimal path, which solves the classical optimal path problem through probability softening. This unified approach transforms a wide range of dynamic programming (DP) problems into directed acyclic graphs in which all paths follow a Gibbs distribution. Using properties of the Gumbel distribution, we show that this Gibbs distribution is equivalent to a message-passing algorithm, and we provide all the ingredients required for variational Bayesian inference over a latent path, which we term Bayesian dynamic programming (BDP). We demonstrate the use of BDP in the latent space of variational autoencoders (VAEs) and propose the BDP-VAE, which captures structured sparse optimal paths as latent variables. This enables end-to-end training of generative models that rely on unobserved structural information. Finally, we validate the behavior of our approach and showcase its applicability in two real-world applications: text-to-speech and singing voice synthesis. Our implementation code is available at https://github.com/XinleiNIU/LatentOptimalPathsBayesianDP.
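
To make the message-passing idea concrete, the sketch below samples a latent path from a Gibbs distribution over the source-to-sink paths of a DAG. This is a minimal illustration under assumed conventions, not the authors' implementation (see the repository above for that): a forward pass computes a temperature-scaled log-partition at each node, which by the max-stability of the Gumbel distribution coincides with the propagated Gumbel location parameter, and a backward pass draws each predecessor with its softmax weight. The function name, graph encoding, and diamond-graph example are illustrative choices.

import math
import random

def logsumexp(xs):
    m = max(xs)
    return m + math.log(sum(math.exp(x - m) for x in xs))

def sample_gibbs_path(nodes, edges, temperature=1.0):
    # nodes: ids in topological order, nodes[0] = source, nodes[-1] = sink.
    # edges: dict mapping (u, v) -> edge weight w(u, v).
    # Returns one path drawn from P(path) proportional to
    # exp(total path weight / temperature).
    #
    # Forward pass: V[v] is the temperature-scaled log-partition over all
    # source-to-v paths (the Gumbel location parameter at node v).
    V = {nodes[0]: 0.0}
    for v in nodes[1:]:
        scores = [(V[u] + w) / temperature
                  for (u, dst), w in edges.items() if dst == v and u in V]
        if scores:
            V[v] = temperature * logsumexp(scores)
    # Backward pass: from the sink, pick each predecessor u of v with
    # probability proportional to exp((V[u] + w(u, v)) / temperature);
    # this yields an exact sample from the Gibbs distribution over paths.
    path = [nodes[-1]]
    while path[-1] != nodes[0]:
        v = path[-1]
        preds = [(u, (V[u] + w) / temperature)
                 for (u, dst), w in edges.items() if dst == v and u in V]
        z = logsumexp([s for _, s in preds])
        r, acc = random.random(), 0.0
        for u, s in preds:
            acc += math.exp(s - z)
            if r <= acc:
                path.append(u)
                break
        else:  # guard against floating-point round-off
            path.append(preds[-1][0])
    return list(reversed(path))

# Diamond DAG with two paths: s-a-t (weight 2.0) and s-b-t (weight 2.5);
# at temperature 0.5 the heavier path is sampled more often.
edges = {("s", "a"): 1.0, ("s", "b"): 2.0, ("a", "t"): 1.0, ("b", "t"): 0.5}
print(sample_gibbs_path(["s", "a", "b", "t"], edges, temperature=0.5))

Per the abstract, the paper derives this recursion from Gumbel properties and supplies the remaining ingredients (e.g., the divergence terms) needed for variational inference over latent paths; the sketch covers only the sampling step.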

Cite this Paper


BibTeX
@InProceedings{pmlr-v235-niu24b,
  title     = {Latent Optimal Paths by {G}umbel Propagation for Variational {B}ayesian Dynamic Programming},
  author    = {Niu, Xinlei and Walder, Christian and Zhang, Jing and Martin, Charles Patrick},
  booktitle = {Proceedings of the 41st International Conference on Machine Learning},
  pages     = {38316--38343},
  year      = {2024},
  editor    = {Salakhutdinov, Ruslan and Kolter, Zico and Heller, Katherine and Weller, Adrian and Oliver, Nuria and Scarlett, Jonathan and Berkenkamp, Felix},
  volume    = {235},
  series    = {Proceedings of Machine Learning Research},
  month     = {21--27 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v235/main/assets/niu24b/niu24b.pdf},
  url       = {https://proceedings.mlr.press/v235/niu24b.html}
}
Endnote
%0 Conference Paper
%T Latent Optimal Paths by Gumbel Propagation for Variational Bayesian Dynamic Programming
%A Xinlei Niu
%A Christian Walder
%A Jing Zhang
%A Charles Patrick Martin
%B Proceedings of the 41st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2024
%E Ruslan Salakhutdinov
%E Zico Kolter
%E Katherine Heller
%E Adrian Weller
%E Nuria Oliver
%E Jonathan Scarlett
%E Felix Berkenkamp
%F pmlr-v235-niu24b
%I PMLR
%P 38316--38343
%U https://proceedings.mlr.press/v235/niu24b.html
%V 235
APA
Niu, X., Walder, C., Zhang, J., & Martin, C. P. (2024). Latent Optimal Paths by Gumbel Propagation for Variational Bayesian Dynamic Programming. Proceedings of the 41st International Conference on Machine Learning, in Proceedings of Machine Learning Research 235:38316-38343. Available from https://proceedings.mlr.press/v235/niu24b.html.