Learning Intra-Batch Connections for Deep Metric Learning

Jenny Denise Seidenschwarz, Ismail Elezi, Laura Leal-Taixé
Proceedings of the 38th International Conference on Machine Learning, PMLR 139:9410-9421, 2021.

Abstract

The goal of metric learning is to learn a function that maps samples to a lower-dimensional space in which similar samples lie closer together than dissimilar ones. In particular, deep metric learning uses neural networks to learn such a mapping. Most approaches rely on losses that only take into account the relations between pairs or triplets of samples, which belong either to the same class or to two different classes. However, these methods do not explore the embedding space in its entirety. To this end, we propose an approach based on message passing networks that takes all the relations in a mini-batch into account. We refine the embedding vectors by exchanging messages among all samples in a given batch, allowing the training process to be aware of the batch's overall structure. Since not all samples are equally important for predicting a decision boundary, we use an attention mechanism during message passing so that each sample can weigh the importance of each of its neighbors accordingly. We achieve state-of-the-art results on clustering and image retrieval on the CUB-200-2011, Cars196, Stanford Online Products, and In-Shop Clothes datasets. To facilitate further research, we make the code and the models available at https://github.com/dvl-tum/intra_batch_connections.
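To make the idea concrete, below is a minimal PyTorch sketch of one step of attention-based message passing over the embeddings of a single mini-batch. The class name IntraBatchMessagePassing, the layer sizes, and the use of a single nn.MultiheadAttention block are illustrative assumptions, not the authors' exact implementation (which may differ, e.g. in the number of message passing steps and attention heads).

import torch
import torch.nn as nn

class IntraBatchMessagePassing(nn.Module):
    """One step of attention-based message passing over a mini-batch (illustrative sketch)."""
    def __init__(self, dim=512, num_heads=4):
        super().__init__()
        # Every sample attends to every other sample, i.e. the mini-batch
        # forms a fully connected graph.
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.norm1 = nn.LayerNorm(dim)
        self.norm2 = nn.LayerNorm(dim)
        self.mlp = nn.Sequential(nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim))

    def forward(self, x):
        # x: (batch_size, dim) embeddings produced by the backbone network.
        h = x.unsqueeze(0)                # treat the whole batch as one "sequence"
        msg, _ = self.attn(h, h, h)       # attention weights decide how strongly
                                          # each neighbor's message contributes
        h = self.norm1(h + msg)           # residual update with the aggregated messages
        h = self.norm2(h + self.mlp(h))   # per-sample feed-forward refinement
        return h.squeeze(0)               # refined embeddings, same shape as x

# Example: refine backbone embeddings before computing a metric-learning loss.
backbone_embeddings = torch.randn(32, 512)                 # 32 samples per mini-batch
refined = IntraBatchMessagePassing()(backbone_embeddings)
print(refined.shape)                                       # torch.Size([32, 512])

In a training loop, the refined embeddings would replace the backbone embeddings before the loss is computed, so the gradient also shapes how each sample attends to its intra-batch neighbors.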

Cite this Paper


BibTeX
@InProceedings{pmlr-v139-seidenschwarz21a,
  title     = {Learning Intra-Batch Connections for Deep Metric Learning},
  author    = {Seidenschwarz, Jenny Denise and Elezi, Ismail and Leal-Taix{\'e}, Laura},
  booktitle = {Proceedings of the 38th International Conference on Machine Learning},
  pages     = {9410--9421},
  year      = {2021},
  editor    = {Meila, Marina and Zhang, Tong},
  volume    = {139},
  series    = {Proceedings of Machine Learning Research},
  month     = {18--24 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v139/seidenschwarz21a/seidenschwarz21a.pdf},
  url       = {https://proceedings.mlr.press/v139/seidenschwarz21a.html},
  abstract  = {The goal of metric learning is to learn a function that maps samples to a lower-dimensional space where similar samples lie closer than dissimilar ones. Particularly, deep metric learning utilizes neural networks to learn such a mapping. Most approaches rely on losses that only take the relations between pairs or triplets of samples into account, which either belong to the same class or two different classes. However, these methods do not explore the embedding space in its entirety. To this end, we propose an approach based on message passing networks that takes all the relations in a mini-batch into account. We refine embedding vectors by exchanging messages among all samples in a given batch allowing the training process to be aware of its overall structure. Since not all samples are equally important to predict a decision boundary, we use an attention mechanism during message passing to allow samples to weigh the importance of each neighbor accordingly. We achieve state-of-the-art results on clustering and image retrieval on the CUB-200-2011, Cars196, Stanford Online Products, and In-Shop Clothes datasets. To facilitate further research, we make available the code and the models at https://github.com/dvl-tum/intra_batch_connections.}
}
Endnote
%0 Conference Paper
%T Learning Intra-Batch Connections for Deep Metric Learning
%A Jenny Denise Seidenschwarz
%A Ismail Elezi
%A Laura Leal-Taixé
%B Proceedings of the 38th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2021
%E Marina Meila
%E Tong Zhang
%F pmlr-v139-seidenschwarz21a
%I PMLR
%P 9410--9421
%U https://proceedings.mlr.press/v139/seidenschwarz21a.html
%V 139
%X The goal of metric learning is to learn a function that maps samples to a lower-dimensional space where similar samples lie closer than dissimilar ones. Particularly, deep metric learning utilizes neural networks to learn such a mapping. Most approaches rely on losses that only take the relations between pairs or triplets of samples into account, which either belong to the same class or two different classes. However, these methods do not explore the embedding space in its entirety. To this end, we propose an approach based on message passing networks that takes all the relations in a mini-batch into account. We refine embedding vectors by exchanging messages among all samples in a given batch allowing the training process to be aware of its overall structure. Since not all samples are equally important to predict a decision boundary, we use an attention mechanism during message passing to allow samples to weigh the importance of each neighbor accordingly. We achieve state-of-the-art results on clustering and image retrieval on the CUB-200-2011, Cars196, Stanford Online Products, and In-Shop Clothes datasets. To facilitate further research, we make available the code and the models at https://github.com/dvl-tum/intra_batch_connections.
APA
Seidenschwarz, J. D., Elezi, I., & Leal-Taixé, L. (2021). Learning Intra-Batch Connections for Deep Metric Learning. Proceedings of the 38th International Conference on Machine Learning, in Proceedings of Machine Learning Research 139:9410-9421. Available from https://proceedings.mlr.press/v139/seidenschwarz21a.html.