Learning Local Invariant Mahalanobis Distances

Ethan Fetaya, Shimon Ullman
Proceedings of the 32nd International Conference on Machine Learning, PMLR 37:162-168, 2015.

Abstract

For many tasks and data types, there are natural transformations to which the data should be invariant or insensitive. For instance, in visual recognition, natural images should be insensitive to rotation and translation. This requirement and its implications have been important in many machine learning applications, and tolerance for image transformations was primarily achieved by using robust feature vectors. In this paper we propose a novel and computationally efficient way to learn a local Mahalanobis metric per datum, and show how we can learn a local invariant metric to any transformation in order to improve performance.
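To make the abstract concrete, here is a minimal illustrative sketch (not the paper's algorithm) of a Mahalanobis distance d_M(x, y) = sqrt((x − y)^T M (x − y)) under a positive semi-definite matrix M; in the paper a separate local M is learned per datum, whereas the matrix below is just a fixed example:

```python
import numpy as np

def mahalanobis(x, y, M):
    """Distance between x and y under a PSD matrix M.

    With M = I this reduces to the ordinary Euclidean distance.
    """
    d = x - y
    return float(np.sqrt(d @ M @ d))

# Any matrix of the form A^T A is positive semi-definite,
# so it is a valid choice for M.
A = np.array([[2.0, 0.0],
              [1.0, 1.0]])
M = A.T @ A

x = np.array([1.0, 0.0])
y = np.array([0.0, 1.0])
print(mahalanobis(x, y, M))          # distance under the learned-style metric
print(mahalanobis(x, y, np.eye(2)))  # Euclidean baseline
```

The choice of M determines which directions in feature space count as "close"; the paper's contribution is learning such an M locally, per datum, so that it is additionally invariant to chosen transformations.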

Cite this Paper


BibTeX
@InProceedings{pmlr-v37-fetaya15,
  title     = {Learning Local Invariant Mahalanobis Distances},
  author    = {Fetaya, Ethan and Ullman, Shimon},
  booktitle = {Proceedings of the 32nd International Conference on Machine Learning},
  pages     = {162--168},
  year      = {2015},
  editor    = {Bach, Francis and Blei, David},
  volume    = {37},
  series    = {Proceedings of Machine Learning Research},
  address   = {Lille, France},
  month     = {07--09 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v37/fetaya15.pdf},
  url       = {https://proceedings.mlr.press/v37/fetaya15.html},
  abstract  = {For many tasks and data types, there are natural transformations to which the data should be invariant or insensitive. For instance, in visual recognition, natural images should be insensitive to rotation and translation. This requirement and its implications have been important in many machine learning applications, and tolerance for image transformations was primarily achieved by using robust feature vectors. In this paper we propose a novel and computationally efficient way to learn a local Mahalanobis metric per datum, and show how we can learn a local invariant metric to any transformation in order to improve performance.}
}
Endnote
%0 Conference Paper
%T Learning Local Invariant Mahalanobis Distances
%A Ethan Fetaya
%A Shimon Ullman
%B Proceedings of the 32nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2015
%E Francis Bach
%E David Blei
%F pmlr-v37-fetaya15
%I PMLR
%P 162--168
%U https://proceedings.mlr.press/v37/fetaya15.html
%V 37
%X For many tasks and data types, there are natural transformations to which the data should be invariant or insensitive. For instance, in visual recognition, natural images should be insensitive to rotation and translation. This requirement and its implications have been important in many machine learning applications, and tolerance for image transformations was primarily achieved by using robust feature vectors. In this paper we propose a novel and computationally efficient way to learn a local Mahalanobis metric per datum, and show how we can learn a local invariant metric to any transformation in order to improve performance.
RIS
TY  - CPAPER
TI  - Learning Local Invariant Mahalanobis Distances
AU  - Ethan Fetaya
AU  - Shimon Ullman
BT  - Proceedings of the 32nd International Conference on Machine Learning
DA  - 2015/06/01
ED  - Francis Bach
ED  - David Blei
ID  - pmlr-v37-fetaya15
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 37
SP  - 162
EP  - 168
L1  - http://proceedings.mlr.press/v37/fetaya15.pdf
UR  - https://proceedings.mlr.press/v37/fetaya15.html
AB  - For many tasks and data types, there are natural transformations to which the data should be invariant or insensitive. For instance, in visual recognition, natural images should be insensitive to rotation and translation. This requirement and its implications have been important in many machine learning applications, and tolerance for image transformations was primarily achieved by using robust feature vectors. In this paper we propose a novel and computationally efficient way to learn a local Mahalanobis metric per datum, and show how we can learn a local invariant metric to any transformation in order to improve performance.
ER  -
APA
Fetaya, E. & Ullman, S. (2015). Learning Local Invariant Mahalanobis Distances. Proceedings of the 32nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 37:162-168. Available from https://proceedings.mlr.press/v37/fetaya15.html.