Learning Nearest-Neighbor Quantizers from Labeled Data by Information Loss Minimization

Svetlana Lazebnik, Maxim Raginsky
Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics, PMLR 2:251-258, 2007.

Abstract

Markov Random Fields (MRFs) are used in a large array of computer vision and machine learning applications. Finding the Maximum A Posteriori (MAP) solution of an MRF is in general intractable, and one has to resort to approximate solutions, such as Belief Propagation, Graph Cuts, or, more recently, approaches based on quadratic programming. We propose a novel type of approximation, Spectral relaxation to Quadratic Programming (SQP). We show that our method offers tighter bounds than recently published work while remaining computationally efficient. We compare our method to other algorithms on random MRFs in various settings.
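To make the quadratic-programming view of MAP inference concrete, here is a minimal NumPy sketch (not the authors' code; the toy potentials and sizes are invented for illustration). The QP relaxation replaces each node's discrete label with a distribution over labels; at an integral (one-hot) assignment the relaxed objective coincides with the exact MAP objective.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 4, 3                      # nodes and labels (hypothetical toy sizes)
edges = [(0, 1), (1, 2), (2, 3)]  # a chain-structured toy MRF

# Random unary and pairwise potentials.
unary = rng.normal(size=(n, k))
pairwise = {e: rng.normal(size=(k, k)) for e in edges}

def qp_objective(mu):
    """Relaxed objective: sum_i theta_i . mu_i + sum_(i,j) mu_i^T Theta_ij mu_j,
    where each mu_i lives on the probability simplex."""
    val = np.sum(unary * mu)
    for (i, j), theta in pairwise.items():
        val += mu[i] @ theta @ mu[j]
    return val

def map_objective(labels):
    """Exact MAP objective for an integral labeling."""
    val = sum(unary[i, labels[i]] for i in range(n))
    val += sum(pairwise[(i, j)][labels[i], labels[j]] for (i, j) in edges)
    return val

# A one-hot mu is a vertex of the relaxed feasible set; there the two agree.
labels = [0, 2, 1, 0]
mu = np.zeros((n, k))
mu[np.arange(n), labels] = 1.0
assert np.isclose(qp_objective(mu), map_objective(labels))
```

Relaxation methods differ in how they optimize this objective over the product of simplices; the spectral approach described in the abstract is one such choice.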

Cite this Paper


BibTeX
@InProceedings{pmlr-v2-lazebnik07a,
  title = {Learning Nearest-Neighbor Quantizers from Labeled Data by Information Loss Minimization},
  author = {Svetlana Lazebnik and Maxim Raginsky},
  booktitle = {Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics},
  pages = {251--258},
  year = {2007},
  editor = {Marina Meila and Xiaotong Shen},
  volume = {2},
  series = {Proceedings of Machine Learning Research},
  address = {San Juan, Puerto Rico},
  month = {21--24 Mar},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v2/lazebnik07a/lazebnik07a.pdf},
  url = {http://proceedings.mlr.press/v2/lazebnik07a.html},
  abstract = {Markov Random Fields (MRFs) are used in a large array of computer vision and machine learning applications. Finding the Maximum A Posteriori (MAP) solution of an MRF is in general intractable, and one has to resort to approximate solutions, such as Belief Propagation, Graph Cuts, or more recently, approaches based on quadratic programming. We propose a novel type of approximation, Spectral relaxation to Quadratic Programming (SQP). We show our method offers tighter bounds than recently published work, while at the same time being computationally efficient. We compare our method to other algorithms on random MRFs in various settings.}
}
Endnote
%0 Conference Paper
%T Learning Nearest-Neighbor Quantizers from Labeled Data by Information Loss Minimization
%A Svetlana Lazebnik
%A Maxim Raginsky
%B Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2007
%E Marina Meila
%E Xiaotong Shen
%F pmlr-v2-lazebnik07a
%I PMLR
%J Proceedings of Machine Learning Research
%P 251--258
%U http://proceedings.mlr.press
%V 2
%W PMLR
%X Markov Random Fields (MRFs) are used in a large array of computer vision and machine learning applications. Finding the Maximum A Posteriori (MAP) solution of an MRF is in general intractable, and one has to resort to approximate solutions, such as Belief Propagation, Graph Cuts, or more recently, approaches based on quadratic programming. We propose a novel type of approximation, Spectral relaxation to Quadratic Programming (SQP). We show our method offers tighter bounds than recently published work, while at the same time being computationally efficient. We compare our method to other algorithms on random MRFs in various settings.
RIS
TY  - CPAPER
TI  - Learning Nearest-Neighbor Quantizers from Labeled Data by Information Loss Minimization
AU  - Svetlana Lazebnik
AU  - Maxim Raginsky
BT  - Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics
PY  - 2007/03/11
DA  - 2007/03/11
ED  - Marina Meila
ED  - Xiaotong Shen
ID  - pmlr-v2-lazebnik07a
PB  - PMLR
DP  - PMLR
SP  - 251
EP  - 258
L1  - http://proceedings.mlr.press/v2/lazebnik07a/lazebnik07a.pdf
UR  - http://proceedings.mlr.press/v2/lazebnik07a.html
AB  - Markov Random Fields (MRFs) are used in a large array of computer vision and machine learning applications. Finding the Maximum A Posteriori (MAP) solution of an MRF is in general intractable, and one has to resort to approximate solutions, such as Belief Propagation, Graph Cuts, or more recently, approaches based on quadratic programming. We propose a novel type of approximation, Spectral relaxation to Quadratic Programming (SQP). We show our method offers tighter bounds than recently published work, while at the same time being computationally efficient. We compare our method to other algorithms on random MRFs in various settings.
ER  -
APA
Lazebnik, S. & Raginsky, M. (2007). Learning Nearest-Neighbor Quantizers from Labeled Data by Information Loss Minimization. Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics, in PMLR 2:251-258.

Related Material