Nonlinear Stein Variational Gradient Descent for Learning Diversified Mixture Models

Dilin Wang, Qiang Liu
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:6576-6585, 2019.

Abstract

Diversification has been shown to be a powerful mechanism for learning robust models in non-convex settings. A notable example is learning mixture models, in which enforcing diversity between the different mixture components allows us to prevent the model collapsing phenomenon and capture more patterns from the observed data. In this work, we present a variational approach for diversity-promoting learning, which leverages the entropy functional as a natural mechanism for enforcing diversity. We develop a simple and efficient functional gradient-based algorithm for optimizing the variational objective function, which provides a significant generalization of Stein variational gradient descent (SVGD). We test our method on various challenging real world problems, including deep embedded clustering and deep anomaly detection. Empirical results show that our method provides an effective mechanism for diversity-promoting learning, achieving substantial improvement over existing methods.
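The paper's algorithm generalizes standard SVGD, which transports a set of particles toward a target distribution using a kernelized gradient update: each particle is attracted by the score of the target (weighted by the kernel) and repelled from other particles by the kernel's gradient, which is what keeps the particles diverse. As background, here is a minimal sketch of the *standard* SVGD update with an RBF kernel (not the paper's nonlinear generalization); the function names, step size, and bandwidth are illustrative choices:

```python
import numpy as np

def svgd_step(X, grad_logp, h=1.0, eps=0.1):
    """One standard SVGD update for particles X of shape (n, d)."""
    n = X.shape[0]
    diff = X[:, None, :] - X[None, :, :]         # diff[j, i] = x_j - x_i
    K = np.exp(-np.sum(diff ** 2, axis=-1) / h)  # RBF kernel k(x_j, x_i)
    grads = grad_logp(X)                         # score of target, shape (n, d)
    # Attraction: sum_j k(x_j, x_i) * grad log p(x_j)
    attract = K.T @ grads
    # Repulsion: sum_j grad_{x_j} k(x_j, x_i) = -2/h * sum_j k(x_j, x_i)(x_j - x_i)
    repulse = -2.0 / h * np.einsum('ji,jid->id', K, diff)
    return X + eps * (attract + repulse) / n

# Example: drive particles toward a standard normal target, grad log p(x) = -x.
rng = np.random.default_rng(0)
X = rng.normal(loc=5.0, scale=0.5, size=(50, 1))  # start far from the target
for _ in range(300):
    X = svgd_step(X, lambda x: -x, h=1.0, eps=0.05)
```

The repulsion term is the diversity mechanism the abstract alludes to: without it, all particles would collapse onto the target's mode, which is the analogue of the model-collapsing phenomenon in mixture learning.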

Cite this Paper

BibTeX
@InProceedings{pmlr-v97-wang19h,
  title = {Nonlinear Stein Variational Gradient Descent for Learning Diversified Mixture Models},
  author = {Wang, Dilin and Liu, Qiang},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning},
  pages = {6576--6585},
  year = {2019},
  editor = {Chaudhuri, Kamalika and Salakhutdinov, Ruslan},
  volume = {97},
  series = {Proceedings of Machine Learning Research},
  month = {09--15 Jun},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v97/wang19h/wang19h.pdf},
  url = {https://proceedings.mlr.press/v97/wang19h.html},
  abstract = {Diversification has been shown to be a powerful mechanism for learning robust models in non-convex settings. A notable example is learning mixture models, in which enforcing diversity between the different mixture components allows us to prevent the model collapsing phenomenon and capture more patterns from the observed data. In this work, we present a variational approach for diversity-promoting learning, which leverages the entropy functional as a natural mechanism for enforcing diversity. We develop a simple and efficient functional gradient-based algorithm for optimizing the variational objective function, which provides a significant generalization of Stein variational gradient descent (SVGD). We test our method on various challenging real world problems, including deep embedded clustering and deep anomaly detection. Empirical results show that our method provides an effective mechanism for diversity-promoting learning, achieving substantial improvement over existing methods.}
}
Endnote
%0 Conference Paper
%T Nonlinear Stein Variational Gradient Descent for Learning Diversified Mixture Models
%A Dilin Wang
%A Qiang Liu
%B Proceedings of the 36th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Ruslan Salakhutdinov
%F pmlr-v97-wang19h
%I PMLR
%P 6576--6585
%U https://proceedings.mlr.press/v97/wang19h.html
%V 97
%X Diversification has been shown to be a powerful mechanism for learning robust models in non-convex settings. A notable example is learning mixture models, in which enforcing diversity between the different mixture components allows us to prevent the model collapsing phenomenon and capture more patterns from the observed data. In this work, we present a variational approach for diversity-promoting learning, which leverages the entropy functional as a natural mechanism for enforcing diversity. We develop a simple and efficient functional gradient-based algorithm for optimizing the variational objective function, which provides a significant generalization of Stein variational gradient descent (SVGD). We test our method on various challenging real world problems, including deep embedded clustering and deep anomaly detection. Empirical results show that our method provides an effective mechanism for diversity-promoting learning, achieving substantial improvement over existing methods.
APA
Wang, D. & Liu, Q. (2019). Nonlinear Stein Variational Gradient Descent for Learning Diversified Mixture Models. Proceedings of the 36th International Conference on Machine Learning, in Proceedings of Machine Learning Research 97:6576-6585. Available from https://proceedings.mlr.press/v97/wang19h.html.