Wasserstein Gradient Flow over Variational Parameter Space for Variational Inference

Dai Hai Nguyen, Tetsuya Sakurai, Hiroshi Mamitsuka
Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, PMLR 258:1756-1764, 2025.

Abstract

Variational Inference (VI) optimizes variational parameters to closely align a variational distribution with the true posterior, being approached through vanilla gradient descent in black-box VI or natural-gradient descent in natural-gradient VI. In this work, we reframe VI as the optimization of an objective that concerns probability distributions defined over a variational parameter space. Subsequently, we propose Wasserstein gradient descent for solving this optimization, where black-box VI and natural-gradient VI can be interpreted as special cases of the proposed Wasserstein gradient descent. To enhance the efficiency of optimization, we develop practical methods for numerically solving the discrete gradient flows. We validate the effectiveness of the proposed methods through experiments on synthetic and real-world datasets, supplemented by theoretical analyses.

Cite this Paper


BibTeX
@InProceedings{pmlr-v258-nguyen25d,
  title     = {Wasserstein Gradient Flow over Variational Parameter Space for Variational Inference},
  author    = {Nguyen, Dai Hai and Sakurai, Tetsuya and Mamitsuka, Hiroshi},
  booktitle = {Proceedings of The 28th International Conference on Artificial Intelligence and Statistics},
  pages     = {1756--1764},
  year      = {2025},
  editor    = {Li, Yingzhen and Mandt, Stephan and Agrawal, Shipra and Khan, Emtiyaz},
  volume    = {258},
  series    = {Proceedings of Machine Learning Research},
  month     = {03--05 May},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v258/main/assets/nguyen25d/nguyen25d.pdf},
  url       = {https://proceedings.mlr.press/v258/nguyen25d.html},
  abstract  = {Variational Inference (VI) optimizes variational parameters to closely align a variational distribution with the true posterior, being approached through vanilla gradient descent in black-box VI or natural-gradient descent in natural-gradient VI. In this work, we reframe VI as the optimization of an objective that concerns probability distributions defined over a variational parameter space. Subsequently, we propose Wasserstein gradient descent for solving this optimization, where black-box VI and natural-gradient VI can be interpreted as special cases of the proposed Wasserstein gradient descent. To enhance the efficiency of optimization, we develop practical methods for numerically solving the discrete gradient flows. We validate the effectiveness of the proposed methods through experiments on synthetic and real-world datasets, supplemented by theoretical analyses.}
}
Endnote
%0 Conference Paper
%T Wasserstein Gradient Flow over Variational Parameter Space for Variational Inference
%A Dai Hai Nguyen
%A Tetsuya Sakurai
%A Hiroshi Mamitsuka
%B Proceedings of The 28th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2025
%E Yingzhen Li
%E Stephan Mandt
%E Shipra Agrawal
%E Emtiyaz Khan
%F pmlr-v258-nguyen25d
%I PMLR
%P 1756--1764
%U https://proceedings.mlr.press/v258/nguyen25d.html
%V 258
%X Variational Inference (VI) optimizes variational parameters to closely align a variational distribution with the true posterior, being approached through vanilla gradient descent in black-box VI or natural-gradient descent in natural-gradient VI. In this work, we reframe VI as the optimization of an objective that concerns probability distributions defined over a variational parameter space. Subsequently, we propose Wasserstein gradient descent for solving this optimization, where black-box VI and natural-gradient VI can be interpreted as special cases of the proposed Wasserstein gradient descent. To enhance the efficiency of optimization, we develop practical methods for numerically solving the discrete gradient flows. We validate the effectiveness of the proposed methods through experiments on synthetic and real-world datasets, supplemented by theoretical analyses.
APA
Nguyen, D.H., Sakurai, T. & Mamitsuka, H. (2025). Wasserstein Gradient Flow over Variational Parameter Space for Variational Inference. Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 258:1756-1764. Available from https://proceedings.mlr.press/v258/nguyen25d.html.