Handling the Positive-Definite Constraint in the Bayesian Learning Rule

Wu Lin, Mark Schmidt, Mohammad Emtiyaz Khan
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:6116-6126, 2020.

Abstract

The Bayesian learning rule is a natural-gradient variational inference method, which not only contains many existing learning algorithms as special cases but also enables the design of new algorithms. Unfortunately, when variational parameters lie in an open constraint set, the rule may not satisfy the constraint and requires line searches, which could slow down the algorithm. In this work, we address this issue for positive-definite constraints by proposing an improved rule that naturally handles the constraints. Our modification is obtained by using Riemannian gradient methods, and is valid when the approximation attains a block-coordinate natural parameterization (e.g., Gaussian distributions and their mixtures). Our method outperforms existing methods without any significant increase in computation. Our work makes it easier to apply the rule in the presence of positive-definite constraints in parameter spaces.
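
To illustrate the constraint issue described above, the following minimal NumPy sketch shows how an additive update to a Gaussian precision (inverse-covariance) matrix can leave the positive-definite cone, and how a quadratic correction term of the kind motivated by Riemannian gradient methods keeps it inside. This is a hedged illustration only: the specific gradient matrix, step size, update form, and helper names are assumptions made here for demonstration, not the paper's exact algorithm.

import numpy as np

def is_pd(S):
    # A symmetric matrix is positive definite iff all its eigenvalues are positive.
    return np.all(np.linalg.eigvalsh(S) > 0)

def naive_step(S, G, beta):
    # Plain additive update of the precision matrix; for a large enough step
    # it can leave the positive-definite cone, forcing a line search on beta.
    return S + beta * G

def corrected_step(S, G, beta):
    # Update with a quadratic correction term (illustrative form). Writing
    # M = S^{-1/2} G S^{-1/2}, the result equals
    # S^{1/2} (I + beta*M + (beta^2/2) M^2) S^{1/2}, and 1 + x + x^2/2 > 0
    # for every real x, so the update stays positive definite.
    return S + beta * G + 0.5 * beta**2 * (G @ np.linalg.solve(S, G))

S = np.eye(2)                                  # current precision matrix (positive definite)
G = -np.array([[3.0, 1.0], [1.0, 2.0]])        # an illustrative symmetric "gradient"
beta = 1.0                                     # deliberately large step size

print(is_pd(naive_step(S, G, beta)))           # False: the naive step breaks the constraint
print(is_pd(corrected_step(S, G, beta)))       # True: the corrected step preserves it

Because a step of this corrected form cannot leave the positive-definite cone, no line search on the step size is needed, which is the practical benefit the abstract refers to.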

Cite this Paper


BibTeX
@InProceedings{pmlr-v119-lin20d,
  title     = {Handling the Positive-Definite Constraint in the {B}ayesian Learning Rule},
  author    = {Lin, Wu and Schmidt, Mark and Khan, Mohammad Emtiyaz},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {6116--6126},
  year      = {2020},
  editor    = {III, Hal Daumé and Singh, Aarti},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/lin20d/lin20d.pdf},
  url       = {https://proceedings.mlr.press/v119/lin20d.html}
}
EndNote
%0 Conference Paper
%T Handling the Positive-Definite Constraint in the Bayesian Learning Rule
%A Wu Lin
%A Mark Schmidt
%A Mohammad Emtiyaz Khan
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-lin20d
%I PMLR
%P 6116--6126
%U https://proceedings.mlr.press/v119/lin20d.html
%V 119
APA
Lin, W., Schmidt, M. & Khan, M.E. (2020). Handling the Positive-Definite Constraint in the Bayesian Learning Rule. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:6116-6126. Available from https://proceedings.mlr.press/v119/lin20d.html.