Learning Nearest-Neighbor Quantizers from Labeled Data by Information Loss Minimization

Svetlana Lazebnik, Maxim Raginsky
Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics, PMLR 2:251-258, 2007.

Abstract

Markov random fields (MRFs) are used in a wide array of computer vision and machine learning applications. Finding the maximum a posteriori (MAP) solution of an MRF is in general intractable, and one has to resort to approximate methods such as belief propagation, graph cuts, or, more recently, approaches based on quadratic programming. We propose a novel type of approximation, spectral relaxation to quadratic programming (SQP). We show that our method offers tighter bounds than recently published work while remaining computationally efficient. We compare our method to other algorithms on random MRFs in various settings.
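For context on the abstract's claim, a sketch of the standard formulation that QP-based relaxations start from (this is textbook background, not taken from the paper; the notation is illustrative): MAP inference in a pairwise MRF over labels $x_i \in \{1,\dots,K\}$ can be written as an integer quadratic program over indicator variables, and the relaxations the abstract refers to drop the integrality constraint.

% MAP objective: maximize \sum_i \theta_i(x_i) + \sum_{(i,j) \in E} \theta_{ij}(x_i, x_j).
% With indicators \mu_i(k) = 1 iff x_i = k, this becomes:
\begin{align}
\max_{\mu}\ & \sum_{i}\sum_{k} \theta_i(k)\,\mu_i(k)
  \;+\; \sum_{(i,j)\in E}\sum_{k,l} \theta_{ij}(k,l)\,\mu_i(k)\,\mu_j(l) \\
\text{s.t.}\ & \sum_{k} \mu_i(k) = 1 \quad \forall i, \qquad \mu_i(k) \in \{0,1\}.
\end{align}
% Relaxing \mu_i(k) \in [0,1] yields the quadratic program that
% relaxation-based approximations (including spectral variants) work with.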

Cite this Paper


BibTeX
@InProceedings{pmlr-v2-lazebnik07a,
  title = {Learning Nearest-Neighbor Quantizers from Labeled Data by Information Loss Minimization},
  author = {Lazebnik, Svetlana and Raginsky, Maxim},
  booktitle = {Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics},
  pages = {251--258},
  year = {2007},
  editor = {Meila, Marina and Shen, Xiaotong},
  volume = {2},
  series = {Proceedings of Machine Learning Research},
  address = {San Juan, Puerto Rico},
  month = {21--24 Mar},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v2/lazebnik07a/lazebnik07a.pdf},
  url = {https://proceedings.mlr.press/v2/lazebnik07a.html},
  abstract = {Markov random fields (MRFs) are used in a wide array of computer vision and machine learning applications. Finding the maximum a posteriori (MAP) solution of an MRF is in general intractable, and one has to resort to approximate methods such as belief propagation, graph cuts, or, more recently, approaches based on quadratic programming. We propose a novel type of approximation, spectral relaxation to quadratic programming (SQP). We show that our method offers tighter bounds than recently published work while remaining computationally efficient. We compare our method to other algorithms on random MRFs in various settings.}
}
EndNote
%0 Conference Paper
%T Learning Nearest-Neighbor Quantizers from Labeled Data by Information Loss Minimization
%A Svetlana Lazebnik
%A Maxim Raginsky
%B Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2007
%E Marina Meila
%E Xiaotong Shen
%F pmlr-v2-lazebnik07a
%I PMLR
%P 251--258
%U https://proceedings.mlr.press/v2/lazebnik07a.html
%V 2
%X Markov random fields (MRFs) are used in a wide array of computer vision and machine learning applications. Finding the maximum a posteriori (MAP) solution of an MRF is in general intractable, and one has to resort to approximate methods such as belief propagation, graph cuts, or, more recently, approaches based on quadratic programming. We propose a novel type of approximation, spectral relaxation to quadratic programming (SQP). We show that our method offers tighter bounds than recently published work while remaining computationally efficient. We compare our method to other algorithms on random MRFs in various settings.
RIS
TY - CPAPER
TI - Learning Nearest-Neighbor Quantizers from Labeled Data by Information Loss Minimization
AU - Svetlana Lazebnik
AU - Maxim Raginsky
BT - Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics
DA - 2007/03/11
ED - Marina Meila
ED - Xiaotong Shen
ID - pmlr-v2-lazebnik07a
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 2
SP - 251
EP - 258
L1 - http://proceedings.mlr.press/v2/lazebnik07a/lazebnik07a.pdf
UR - https://proceedings.mlr.press/v2/lazebnik07a.html
AB - Markov random fields (MRFs) are used in a wide array of computer vision and machine learning applications. Finding the maximum a posteriori (MAP) solution of an MRF is in general intractable, and one has to resort to approximate methods such as belief propagation, graph cuts, or, more recently, approaches based on quadratic programming. We propose a novel type of approximation, spectral relaxation to quadratic programming (SQP). We show that our method offers tighter bounds than recently published work while remaining computationally efficient. We compare our method to other algorithms on random MRFs in various settings.
ER -
APA
Lazebnik, S. & Raginsky, M. (2007). Learning Nearest-Neighbor Quantizers from Labeled Data by Information Loss Minimization. Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 2:251-258. Available from https://proceedings.mlr.press/v2/lazebnik07a.html.