Max-Margin Multiple-Instance Dictionary Learning

Xinggang Wang, Baoyuan Wang, Xiang Bai, Wenyu Liu, Zhuowen Tu
Proceedings of the 30th International Conference on Machine Learning, PMLR 28(3):846-854, 2013.

Abstract

Dictionary learning has become an increasingly important task in machine learning, as it is fundamental to the representation problem. A number of emerging techniques specifically include a codebook learning step, in which a critical knowledge abstraction process is carried out. Existing approaches to dictionary (codebook) learning are either generative (unsupervised, e.g., k-means) or discriminative (supervised, e.g., extremely randomized forests). In this paper, we propose a multiple-instance learning (MIL) strategy (along the lines of weakly supervised learning) for dictionary learning. Each code is represented by a classifier, such as a linear SVM, which naturally performs metric fusion for multi-channel features. We design a formulation to simultaneously learn mixtures of codes by maximizing classification margins in MIL. State-of-the-art results are observed on image classification benchmarks using the learned codebooks, which are both compact and effective.
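The idea of representing each codeword by a classifier trained under multiple-instance supervision can be illustrated with a minimal sketch. This is not the paper's max-margin mixture formulation; it is a simplified mi-SVM-style alternation for a single codeword, on hypothetical toy data, where each "image" is a bag of local descriptors and only bag-level labels are given:

```python
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)

# Hypothetical toy data: each "image" is a bag of local descriptors.
# Positive bags contain a few instances drawn from a shifted cluster.
def make_bag(positive, n_inst=20, dim=8):
    X = rng.normal(size=(n_inst, dim))
    if positive:
        X[: n_inst // 4] += 3.0  # a handful of truly positive instances
    return X

bags = [make_bag(i % 2 == 0) for i in range(40)]
labels = np.array([1 if i % 2 == 0 else -1 for i in range(40)])

# One codeword = one linear SVM over instances. Initialize instance
# labels from bag labels (standard mi-SVM heuristic), then alternate:
# train the SVM, re-label instances inside positive bags by the
# current decision values, keeping at least one positive per bag.
X_all = np.vstack(bags)
bag_id = np.concatenate([[i] * len(b) for i, b in enumerate(bags)])
y_inst = np.concatenate([[labels[i]] * len(b) for i, b in enumerate(bags)])

clf = LinearSVC(C=1.0, max_iter=5000)
for _ in range(3):  # a few alternations suffice on this toy problem
    clf.fit(X_all, y_inst)
    scores = clf.decision_function(X_all)
    for i in np.where(labels == 1)[0]:
        idx = np.where(bag_id == i)[0]
        y_inst[idx] = np.where(scores[idx] > 0, 1, -1)
        if not (y_inst[idx] == 1).any():
            y_inst[idx[np.argmax(scores[idx])]] = 1

# Encode each bag by max-pooling the codeword's response, giving one
# dimension of a codebook representation (the paper learns K such
# codes jointly under a max-margin objective).
codes = np.array([clf.decision_function(b).max() for b in bags])
```

With K such classifiers the max-pooled responses form a K-dimensional image descriptor; the paper's contribution is learning all K codes jointly by maximizing the MIL classification margin rather than training them independently as above.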

Cite this Paper


BibTeX
@InProceedings{pmlr-v28-wang13d, title = {Max-Margin Multiple-Instance Dictionary Learning}, author = {Wang, Xinggang and Wang, Baoyuan and Bai, Xiang and Liu, Wenyu and Tu, Zhuowen}, booktitle = {Proceedings of the 30th International Conference on Machine Learning}, pages = {846--854}, year = {2013}, editor = {Dasgupta, Sanjoy and McAllester, David}, volume = {28}, number = {3}, series = {Proceedings of Machine Learning Research}, address = {Atlanta, Georgia, USA}, month = {17--19 Jun}, publisher = {PMLR}, pdf = {http://proceedings.mlr.press/v28/wang13d.pdf}, url = {https://proceedings.mlr.press/v28/wang13d.html}, abstract = {Dictionary learning has become an increasingly important task in machine learning, as it is fundamental to the representation problem. A number of emerging techniques specifically include a codebook learning step, in which a critical knowledge abstraction process is carried out. Existing approaches to dictionary (codebook) learning are either generative (unsupervised, e.g., k-means) or discriminative (supervised, e.g., extremely randomized forests). In this paper, we propose a multiple-instance learning (MIL) strategy (along the lines of weakly supervised learning) for dictionary learning. Each code is represented by a classifier, such as a linear SVM, which naturally performs metric fusion for multi-channel features. We design a formulation to simultaneously learn mixtures of codes by maximizing classification margins in MIL. State-of-the-art results are observed on image classification benchmarks using the learned codebooks, which are both compact and effective.} }
Endnote
%0 Conference Paper %T Max-Margin Multiple-Instance Dictionary Learning %A Xinggang Wang %A Baoyuan Wang %A Xiang Bai %A Wenyu Liu %A Zhuowen Tu %B Proceedings of the 30th International Conference on Machine Learning %C Proceedings of Machine Learning Research %D 2013 %E Sanjoy Dasgupta %E David McAllester %F pmlr-v28-wang13d %I PMLR %P 846--854 %U https://proceedings.mlr.press/v28/wang13d.html %V 28 %N 3 %X Dictionary learning has become an increasingly important task in machine learning, as it is fundamental to the representation problem. A number of emerging techniques specifically include a codebook learning step, in which a critical knowledge abstraction process is carried out. Existing approaches to dictionary (codebook) learning are either generative (unsupervised, e.g., k-means) or discriminative (supervised, e.g., extremely randomized forests). In this paper, we propose a multiple-instance learning (MIL) strategy (along the lines of weakly supervised learning) for dictionary learning. Each code is represented by a classifier, such as a linear SVM, which naturally performs metric fusion for multi-channel features. We design a formulation to simultaneously learn mixtures of codes by maximizing classification margins in MIL. State-of-the-art results are observed on image classification benchmarks using the learned codebooks, which are both compact and effective.
RIS
TY - CPAPER TI - Max-Margin Multiple-Instance Dictionary Learning AU - Xinggang Wang AU - Baoyuan Wang AU - Xiang Bai AU - Wenyu Liu AU - Zhuowen Tu BT - Proceedings of the 30th International Conference on Machine Learning DA - 2013/05/26 ED - Sanjoy Dasgupta ED - David McAllester ID - pmlr-v28-wang13d PB - PMLR DP - Proceedings of Machine Learning Research VL - 28 IS - 3 SP - 846 EP - 854 L1 - http://proceedings.mlr.press/v28/wang13d.pdf UR - https://proceedings.mlr.press/v28/wang13d.html AB - Dictionary learning has become an increasingly important task in machine learning, as it is fundamental to the representation problem. A number of emerging techniques specifically include a codebook learning step, in which a critical knowledge abstraction process is carried out. Existing approaches to dictionary (codebook) learning are either generative (unsupervised, e.g., k-means) or discriminative (supervised, e.g., extremely randomized forests). In this paper, we propose a multiple-instance learning (MIL) strategy (along the lines of weakly supervised learning) for dictionary learning. Each code is represented by a classifier, such as a linear SVM, which naturally performs metric fusion for multi-channel features. We design a formulation to simultaneously learn mixtures of codes by maximizing classification margins in MIL. State-of-the-art results are observed on image classification benchmarks using the learned codebooks, which are both compact and effective. ER -
APA
Wang, X., Wang, B., Bai, X., Liu, W. &amp; Tu, Z. (2013). Max-Margin Multiple-Instance Dictionary Learning. Proceedings of the 30th International Conference on Machine Learning, in Proceedings of Machine Learning Research 28(3):846-854. Available from https://proceedings.mlr.press/v28/wang13d.html.