Fast Variational Mode-Seeking

Bo Thiesson, Jingu Kim
Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics, PMLR 22:1230-1242, 2012.

Abstract

Mode-seeking algorithms (e.g., mean-shift) constitute a class of powerful non-parametric clustering methods, but they are slow. We present VMS, a dual-tree based variational EM framework for mode-seeking that greatly accelerates performance. VMS has a number of pleasing properties: it generalizes across different mode-seeking algorithms, it does not have typical homoscedasticity constraints on kernel bandwidths, and it is the first truly sub-quadratic acceleration method that maintains provable convergence for a well-defined objective function. Experimental results demonstrate acceleration benefits over competing methods and show that VMS is particularly desirable for data sets of massive size, where a coarser approximation is needed to improve the computational efficiency.
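For context, below is a minimal sketch of the standard (naive, quadratic-cost) mean-shift update that acceleration methods such as VMS aim to speed up. It is not the paper's dual-tree variational EM algorithm; the function and parameter names (mean_shift, bandwidth, tol) are illustrative assumptions, and a Gaussian kernel is assumed.

import numpy as np

def mean_shift(X, bandwidth=1.0, max_iter=100, tol=1e-5):
    """Move every point uphill toward a mode of the Gaussian kernel
    density estimate built from X (shape: n_points x n_dims)."""
    modes = X.astype(float).copy()
    for _ in range(max_iter):
        shifted = np.empty_like(modes)
        for j, x in enumerate(modes):
            # Gaussian kernel weight between x and every data point:
            # O(n) work per point, hence O(n^2) per sweep -- the cost
            # that sub-quadratic methods like VMS reduce.
            sq_dist = np.sum((X - x) ** 2, axis=1)
            w = np.exp(-sq_dist / (2.0 * bandwidth ** 2))
            shifted[j] = w @ X / w.sum()  # weighted mean = mean-shift update
        moved = np.max(np.linalg.norm(shifted - modes, axis=1))
        modes = shifted
        if moved < tol:
            break
    return modes  # points that land on the same mode form one cluster

# Toy usage: two well-separated blobs collapse onto two modes.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(5, 0.3, (50, 2))])
print(np.unique(np.round(mean_shift(X, bandwidth=0.8), 1), axis=0))

Each iteration of this baseline touches every data point for every query point, which is what makes plain mode-seeking slow on large data sets and motivates the approximation framework described in the abstract.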

Cite this Paper


BibTeX
@InProceedings{pmlr-v22-thiesson12,
  title     = {Fast Variational Mode-Seeking},
  author    = {Thiesson, Bo and Kim, Jingu},
  booktitle = {Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics},
  pages     = {1230--1242},
  year      = {2012},
  editor    = {Lawrence, Neil D. and Girolami, Mark},
  volume    = {22},
  series    = {Proceedings of Machine Learning Research},
  address   = {La Palma, Canary Islands},
  month     = {21--23 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v22/thiesson12/thiesson12.pdf},
  url       = {https://proceedings.mlr.press/v22/thiesson12.html},
  abstract  = {Mode-seeking algorithms (e.g., mean-shift) constitute a class of powerful non-parametric clustering methods, but they are slow. We present VMS, a dual-tree based variational EM framework for mode-seeking that greatly accelerates performance. VMS has a number of pleasing properties: it generalizes across different mode-seeking algorithms, it does not have typical homoscedasticity constraints on kernel bandwidths, and it is the first truly sub-quadratic acceleration method that maintains provable convergence for a well-defined objective function. Experimental results demonstrate acceleration benefits over competing methods and show that VMS is particularly desirable for data sets of massive size, where a coarser approximation is needed to improve the computational efficiency.}
}
Endnote
%0 Conference Paper
%T Fast Variational Mode-Seeking
%A Bo Thiesson
%A Jingu Kim
%B Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2012
%E Neil D. Lawrence
%E Mark Girolami
%F pmlr-v22-thiesson12
%I PMLR
%P 1230--1242
%U https://proceedings.mlr.press/v22/thiesson12.html
%V 22
%X Mode-seeking algorithms (e.g., mean-shift) constitute a class of powerful non-parametric clustering methods, but they are slow. We present VMS, a dual-tree based variational EM framework for mode-seeking that greatly accelerates performance. VMS has a number of pleasing properties: it generalizes across different mode-seeking algorithms, it does not have typical homoscedasticity constraints on kernel bandwidths, and it is the first truly sub-quadratic acceleration method that maintains provable convergence for a well-defined objective function. Experimental results demonstrate acceleration benefits over competing methods and show that VMS is particularly desirable for data sets of massive size, where a coarser approximation is needed to improve the computational efficiency.
RIS
TY - CPAPER
TI - Fast Variational Mode-Seeking
AU - Bo Thiesson
AU - Jingu Kim
BT - Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics
DA - 2012/03/21
ED - Neil D. Lawrence
ED - Mark Girolami
ID - pmlr-v22-thiesson12
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 22
SP - 1230
EP - 1242
L1 - http://proceedings.mlr.press/v22/thiesson12/thiesson12.pdf
UR - https://proceedings.mlr.press/v22/thiesson12.html
AB - Mode-seeking algorithms (e.g., mean-shift) constitute a class of powerful non-parametric clustering methods, but they are slow. We present VMS, a dual-tree based variational EM framework for mode-seeking that greatly accelerates performance. VMS has a number of pleasing properties: it generalizes across different mode-seeking algorithms, it does not have typical homoscedasticity constraints on kernel bandwidths, and it is the first truly sub-quadratic acceleration method that maintains provable convergence for a well-defined objective function. Experimental results demonstrate acceleration benefits over competing methods and show that VMS is particularly desirable for data sets of massive size, where a coarser approximation is needed to improve the computational efficiency.
ER -
APA
Thiesson, B. & Kim, J. (2012). Fast Variational Mode-Seeking. Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 22:1230-1242. Available from https://proceedings.mlr.press/v22/thiesson12.html.