A Reparameterization of Mixtures of Truncated Basis Functions and its Applications

Antonio Salmerón, Helge Langseth, Andrés Masegosa, Thomas D. Nielsen
Proceedings of The 11th International Conference on Probabilistic Graphical Models, PMLR 186:205-216, 2022.

Abstract

Mixtures of truncated basis functions (MoTBFs) are a popular tool within the context of hybrid Bayesian networks, mainly because they are compatible with efficient probabilistic inference schemes. However, their standard parameterization allows negative mixture weights as well as non-normalized mixture terms, which prevents them from benefiting from existing likelihood-based mixture estimation methods such as the EM algorithm. Furthermore, the standard parameterization does not facilitate the definition of a Bayesian framework that would ideally allow conjugate analysis. In this paper we show how MoTBFs can be reparameterized by applying a strategy already used in the literature for Gaussian mixture models with negative terms. We exemplify how the new parameterization is compatible with the EM algorithm and conjugate analysis.
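
For readers unfamiliar with the model class, a univariate MoTBF potential on an interval [a, b] is usually written as a linear combination of fixed basis functions. The following display is a sketch based on the standard definition from the MoTBF literature; the symbols \theta_i, \psi_i and [a, b] are generic notation, not taken from this paper:

\[
f(x) \;=\; \sum_{i=0}^{k} \theta_i\, \psi_i(x), \qquad x \in [a, b],
\]

where the basis functions \psi_i are typically polynomials (\psi_i(x) = x^i) or exponentials (\psi_i(x) = \mathrm{e}^{i x}), and the coefficients \theta_i \in \mathbb{R} are constrained only so that f is non-negative on [a, b] and integrates to one there. Because an individual term \theta_i \psi_i(x) may be negative and is not a normalized density, it cannot be read as "mixture weight times component density", which is precisely the latent-variable interpretation that the EM algorithm and conjugate mixture priors rely on; the reparameterization proposed in the paper is aimed at recovering such an interpretation.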

Cite this Paper


BibTeX
@InProceedings{pmlr-v186-salmeron22a,
  title     = {A Reparameterization of Mixtures of Truncated Basis Functions and its Applications},
  author    = {Salmer{\'o}n, Antonio and Langseth, Helge and Masegosa, Andr{\'e}s and Nielsen, Thomas D.},
  booktitle = {Proceedings of The 11th International Conference on Probabilistic Graphical Models},
  pages     = {205--216},
  year      = {2022},
  editor    = {Salmer{\'o}n, Antonio and Rum{\'i}, Rafael},
  volume    = {186},
  series    = {Proceedings of Machine Learning Research},
  month     = {05--07 Oct},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v186/salmeron22a/salmeron22a.pdf},
  url       = {https://proceedings.mlr.press/v186/salmeron22a.html},
  abstract  = {Mixtures of truncated basis functions (MoTBFs) are a popular tool within the context of hybrid Bayesian networks, mainly because they are compatible with efficient probabilistic inference schemes. However, their standard parameterization allows the presence of negative mixture weights as well as non-normalized mixture terms, which prevents them from benefiting from existing likelihood-based mixture estimation methods like the EM algorithm. Furthermore, the standard parameterization does not facilitate the definition of a Bayesian framework ideally allowing conjugate analysis. In this paper we show how MoTBFs can be reparameterized applying a strategy already used in the literature for Gaussian mixture models with negative terms. We exemplify how the new parameterization is compatible with the EM algorithm and conjugate analysis.}
}
Endnote
%0 Conference Paper
%T A Reparameterization of Mixtures of Truncated Basis Functions and its Applications
%A Antonio Salmerón
%A Helge Langseth
%A Andrés Masegosa
%A Thomas D. Nielsen
%B Proceedings of The 11th International Conference on Probabilistic Graphical Models
%C Proceedings of Machine Learning Research
%D 2022
%E Antonio Salmerón
%E Rafael Rumí
%F pmlr-v186-salmeron22a
%I PMLR
%P 205--216
%U https://proceedings.mlr.press/v186/salmeron22a.html
%V 186
%X Mixtures of truncated basis functions (MoTBFs) are a popular tool within the context of hybrid Bayesian networks, mainly because they are compatible with efficient probabilistic inference schemes. However, their standard parameterization allows the presence of negative mixture weights as well as non-normalized mixture terms, which prevents them from benefiting from existing likelihood-based mixture estimation methods like the EM algorithm. Furthermore, the standard parameterization does not facilitate the definition of a Bayesian framework ideally allowing conjugate analysis. In this paper we show how MoTBFs can be reparameterized applying a strategy already used in the literature for Gaussian mixture models with negative terms. We exemplify how the new parameterization is compatible with the EM algorithm and conjugate analysis.
APA
Salmerón, A., Langseth, H., Masegosa, A. & Nielsen, T.D. (2022). A Reparameterization of Mixtures of Truncated Basis Functions and its Applications. Proceedings of The 11th International Conference on Probabilistic Graphical Models, in Proceedings of Machine Learning Research 186:205-216. Available from https://proceedings.mlr.press/v186/salmeron22a.html.
