Provable Smoothness Guarantees for Black-Box Variational Inference

Justin Domke
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:2587-2596, 2020.

Abstract

Black-box variational inference tries to approximate a complex target distribution through a gradient-based optimization of the parameters of a simpler distribution. Provable convergence guarantees require structural properties of the objective. This paper shows that for location-scale family approximations, if the target is M-Lipschitz smooth, then so is the “energy” part of the variational objective. The key proof idea is to describe gradients in a certain inner-product space, thus permitting the use of Bessel’s inequality. This result gives bounds on the location of the optimal parameters, and is a key ingredient for convergence guarantees.
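As a hedged sketch of the setting the abstract describes (the notation here is illustrative, not taken from the paper's text): a location-scale approximation draws $\mathsf{z} = C\mathsf{u} + m$ with $\mathsf{u}$ from a fixed base distribution, and the "energy" part of the variational objective is the expected target negative log-density under that reparameterization.

```latex
% Energy term of the variational objective for a location-scale family
% (illustrative notation; f = -\log p is the target's negative log-density,
%  u follows a fixed base distribution, parameters are the location m and scale C):
\[
  \ell(m, C) \;=\; \mathbb{E}_{\mathsf{u}}\!\left[ f(C\mathsf{u} + m) \right].
\]
% The paper's result: if f is M-Lipschitz smooth, i.e.
\[
  \lVert \nabla f(a) - \nabla f(b) \rVert \;\le\; M \lVert a - b \rVert
  \quad \text{for all } a, b,
\]
% then \ell is likewise Lipschitz smooth in the parameters (m, C),
% with a constant of the same order as M (under conditions on the base
% distribution; see the paper for the precise statement).
```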

Cite this Paper


BibTeX
@InProceedings{pmlr-v119-domke20a,
  title     = {Provable Smoothness Guarantees for Black-Box Variational Inference},
  author    = {Domke, Justin},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {2587--2596},
  year      = {2020},
  editor    = {Daumé III, Hal and Singh, Aarti},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/domke20a/domke20a.pdf},
  url       = {http://proceedings.mlr.press/v119/domke20a.html},
  abstract  = {Black-box variational inference tries to approximate a complex target distribution through a gradient-based optimization of the parameters of a simpler distribution. Provable convergence guarantees require structural properties of the objective. This paper shows that for location-scale family approximations, if the target is M-Lipschitz smooth, then so is the “energy” part of the variational objective. The key proof idea is to describe gradients in a certain inner-product space, thus permitting the use of Bessel’s inequality. This result gives bounds on the location of the optimal parameters, and is a key ingredient for convergence guarantees.}
}
Endnote
%0 Conference Paper
%T Provable Smoothness Guarantees for Black-Box Variational Inference
%A Justin Domke
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-domke20a
%I PMLR
%P 2587--2596
%U http://proceedings.mlr.press/v119/domke20a.html
%V 119
%X Black-box variational inference tries to approximate a complex target distribution through a gradient-based optimization of the parameters of a simpler distribution. Provable convergence guarantees require structural properties of the objective. This paper shows that for location-scale family approximations, if the target is M-Lipschitz smooth, then so is the “energy” part of the variational objective. The key proof idea is to describe gradients in a certain inner-product space, thus permitting the use of Bessel’s inequality. This result gives bounds on the location of the optimal parameters, and is a key ingredient for convergence guarantees.
APA
Domke, J. (2020). Provable Smoothness Guarantees for Black-Box Variational Inference. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:2587-2596. Available from http://proceedings.mlr.press/v119/domke20a.html.