Large-Margin Metric Learning for Constrained Partitioning Problems

Rémi Lajugie, Francis Bach, Sylvain Arlot
Proceedings of the 31st International Conference on Machine Learning, PMLR 32(1):297-305, 2014.

Abstract

We consider unsupervised partitioning problems based explicitly or implicitly on the minimization of Euclidean distortions, such as clustering, image or video segmentation, and other change-point detection problems. We emphasize cases with specific structure, which include many practical situations ranging from mean-based change-point detection to image segmentation problems. We aim at learning a Mahalanobis metric for these unsupervised problems, leading to feature weighting and/or selection. This is done in a supervised way by assuming the availability of several (partially) labeled datasets that share the same metric. We cast the metric learning problem as a large-margin structured prediction problem, with proper definition of regularizers and losses, leading to a convex optimization problem which can be solved efficiently. Our experiments show how learning the metric can significantly improve performance on bioinformatics, video, and image segmentation problems.
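The general recipe can be illustrated with a toy sketch (this is an illustrative reconstruction, not the authors' implementation; the data, parameters, and function names below are all assumptions): learn a diagonal Mahalanobis metric for mean-based change-point detection by projected subgradient steps on a structured hinge loss, using exact dynamic programming both for decoding and for loss-augmented inference. Because the segment distortion is linear in the metric weights, the problem is convex in those weights.

```python
import numpy as np

def seg_stats(prefix, prefix_sq, a, b):
    # Per-feature within-segment sum of squared deviations for [a, b),
    # computed from prefix sums: S2 - S^2 / n.
    n = b - a
    s = prefix[b] - prefix[a]
    return (prefix_sq[b] - prefix_sq[a]) - s * s / n

def decode(X, k, w, true_labels=None):
    # Exact DP over k-segment partitions minimizing the w-weighted
    # distortion; when true_labels is given, performs loss-augmented
    # decoding (distortion minus per-position Hamming loss).
    T, d = X.shape
    prefix = np.vstack([np.zeros(d), np.cumsum(X, axis=0)])
    prefix_sq = np.vstack([np.zeros(d), np.cumsum(X ** 2, axis=0)])
    dp = np.full((k + 1, T + 1), np.inf)
    back = np.zeros((k + 1, T + 1), dtype=int)
    dp[0, 0] = 0.0
    for s in range(1, k + 1):
        for b in range(s, T + 1):
            for a in range(s - 1, b):
                c = dp[s - 1, a] + w @ seg_stats(prefix, prefix_sq, a, b)
                if true_labels is not None:
                    c -= np.sum(true_labels[a:b] != s - 1)
                if c < dp[s, b]:
                    dp[s, b], back[s, b] = c, a
    labels = np.empty(T, dtype=int)
    b = T
    for s in range(k, 0, -1):
        a = back[s, b]
        labels[a:b] = s - 1
        b = a
    return labels

def phi(X, labels):
    # Feature map: per-feature distortion, so distortion(y; w) = w @ phi(X, y).
    out = np.zeros(X.shape[1])
    for lab in np.unique(labels):
        Xs = X[labels == lab]
        out += ((Xs - Xs.mean(0)) ** 2).sum(0)
    return out

# Toy data: feature 0 carries the true change-points (t = 20, 40);
# feature 1 has larger, misleading jumps at t = 10 and t = 50.
rng = np.random.default_rng(0)
T = 60
y_true = np.repeat([0, 1, 2], 20)
X = rng.normal(scale=0.2, size=(T, 2))
X[:, 0] += np.repeat([0.0, 3.0, 0.0], 20)
X[:, 1] += np.where((np.arange(T) >= 10) & (np.arange(T) < 50), 8.0, 0.0)

w = np.ones(2)                                    # diagonal metric weights
eta = 1e-3
for _ in range(30):
    y_aug = decode(X, 3, w, true_labels=y_true)   # loss-augmented inference
    grad = phi(X, y_true) - phi(X, y_aug)         # hinge subgradient in w
    w = np.maximum(w - eta * grad, 1e-3)          # projected step, w > 0

y_pred = decode(X, 3, w)
```

With the uniform metric the decoder is fooled by the high-variance nuisance feature; after a few subgradient steps the learned weights suppress it and the true boundaries are recovered, which is the feature-weighting effect the abstract describes.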

Cite this Paper


BibTeX
@InProceedings{pmlr-v32-lajugie14,
  title     = {Large-Margin Metric Learning for Constrained Partitioning Problems},
  author    = {Rémi Lajugie and Francis Bach and Sylvain Arlot},
  booktitle = {Proceedings of the 31st International Conference on Machine Learning},
  pages     = {297--305},
  year      = {2014},
  editor    = {Eric P. Xing and Tony Jebara},
  volume    = {32},
  number    = {1},
  series    = {Proceedings of Machine Learning Research},
  address   = {Bejing, China},
  month     = {22--24 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v32/lajugie14.pdf},
  url       = {http://proceedings.mlr.press/v32/lajugie14.html},
  abstract  = {We consider unsupervised partitioning problems based explicitly or implicitly on the minimization of Euclidean distortions, such as clustering, image or video segmentation, and other change-point detection problems. We emphasize on cases with specific structure, which include many practical situations ranging from mean-based change-point detection to image segmentation problems. We aim at learning a Mahalanobis metric for these unsupervised problems, leading to feature weighting and/or selection. This is done in a supervised way by assuming the availability of several (partially) labeled datasets that share the same metric. We cast the metric learning problem as a large-margin structured prediction problem, with proper definition of regularizers and losses, leading to a convex optimization problem which can be solved efficiently. Our experiments show how learning the metric can significantly improve performance on bioinformatics, video or image segmentation problems.}
}
Endnote
%0 Conference Paper
%T Large-Margin Metric Learning for Constrained Partitioning Problems
%A Rémi Lajugie
%A Francis Bach
%A Sylvain Arlot
%B Proceedings of the 31st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2014
%E Eric P. Xing
%E Tony Jebara
%F pmlr-v32-lajugie14
%I PMLR
%J Proceedings of Machine Learning Research
%P 297--305
%U http://proceedings.mlr.press
%V 32
%N 1
%W PMLR
%X We consider unsupervised partitioning problems based explicitly or implicitly on the minimization of Euclidean distortions, such as clustering, image or video segmentation, and other change-point detection problems. We emphasize on cases with specific structure, which include many practical situations ranging from mean-based change-point detection to image segmentation problems. We aim at learning a Mahalanobis metric for these unsupervised problems, leading to feature weighting and/or selection. This is done in a supervised way by assuming the availability of several (partially) labeled datasets that share the same metric. We cast the metric learning problem as a large-margin structured prediction problem, with proper definition of regularizers and losses, leading to a convex optimization problem which can be solved efficiently. Our experiments show how learning the metric can significantly improve performance on bioinformatics, video or image segmentation problems.
RIS
TY - CPAPER
TI - Large-Margin Metric Learning for Constrained Partitioning Problems
AU - Rémi Lajugie
AU - Francis Bach
AU - Sylvain Arlot
BT - Proceedings of the 31st International Conference on Machine Learning
PY - 2014/01/27
DA - 2014/01/27
ED - Eric P. Xing
ED - Tony Jebara
ID - pmlr-v32-lajugie14
PB - PMLR
SP - 297
DP - PMLR
EP - 305
L1 - http://proceedings.mlr.press/v32/lajugie14.pdf
UR - http://proceedings.mlr.press/v32/lajugie14.html
AB - We consider unsupervised partitioning problems based explicitly or implicitly on the minimization of Euclidean distortions, such as clustering, image or video segmentation, and other change-point detection problems. We emphasize on cases with specific structure, which include many practical situations ranging from mean-based change-point detection to image segmentation problems. We aim at learning a Mahalanobis metric for these unsupervised problems, leading to feature weighting and/or selection. This is done in a supervised way by assuming the availability of several (partially) labeled datasets that share the same metric. We cast the metric learning problem as a large-margin structured prediction problem, with proper definition of regularizers and losses, leading to a convex optimization problem which can be solved efficiently. Our experiments show how learning the metric can significantly improve performance on bioinformatics, video or image segmentation problems.
ER -
APA
Lajugie, R., Bach, F. & Arlot, S. (2014). Large-Margin Metric Learning for Constrained Partitioning Problems. Proceedings of the 31st International Conference on Machine Learning, in PMLR 32(1):297-305
