Distance Metric Learning with Joint Representation Diversification

Xu Chu, Yang Lin, Yasha Wang, Xiting Wang, Hailong Yu, Xin Gao, Qi Tong
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:1962-1973, 2020.

Abstract

Distance metric learning (DML) aims to learn a representation space equipped with a metric, such that similar examples are closer than dissimilar examples with respect to that metric. The recent success of DNNs has motivated many DML losses that encourage intra-class compactness and inter-class separability. The trade-off between intra-class compactness and inter-class separability shapes the DML representation space by determining how much information about the original inputs to retain. In this paper, we propose Distance Metric Learning with Joint Representation Diversification (JRD), which allows a better balancing point between intra-class compactness and inter-class separability. Specifically, we propose a Joint Representation Similarity regularizer that captures different abstraction levels of invariant features and diversifies the joint distributions of representations across multiple layers. Experiments on three deep DML benchmark datasets demonstrate the effectiveness of the proposed approach.
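
For readers skimming the abstract, the PyTorch-style sketch below illustrates the general shape of such an objective: a standard batch-hard triplet loss on the embedding layer combined with a regularizer that penalizes the similarity of joint multi-layer representations between examples from different classes. The linear kernel, the product-of-Gram-matrices form of the joint similarity, the helper names, and the weight `lam` are illustrative assumptions based on the abstract alone, not the authors' implementation.

```python
# Minimal sketch (assumptions, not the paper's code): a DML loss plus a
# regularizer that diversifies joint multi-layer representations across classes.
import torch
import torch.nn.functional as F


def linear_gram(feats: torch.Tensor) -> torch.Tensor:
    """Gram matrix of L2-normalized features: (batch, dim) -> (batch, batch)."""
    feats = F.normalize(feats, dim=1)
    return feats @ feats.t()


def joint_representation_similarity(layer_feats, labels):
    """Cross-class similarity of joint multi-layer representations (illustrative).

    layer_feats: list of (batch, ...) tensors, one per chosen layer.
    The joint similarity of two examples is taken as the product of their
    per-layer kernel similarities; averaging it over pairs from *different*
    classes and minimizing it pushes classes apart jointly at every layer.
    """
    n = labels.size(0)
    joint = torch.ones(n, n, device=labels.device)
    for f in layer_feats:
        joint = joint * linear_gram(f.flatten(1))
    diff_class = (labels[:, None] != labels[None, :]).float()
    return (joint * diff_class).sum() / diff_class.sum().clamp(min=1.0)


def jrd_style_loss(embeddings, layer_feats, labels, margin=0.2, lam=0.1):
    """Batch-hard triplet loss (for illustration) + diversification regularizer."""
    dist = torch.cdist(embeddings, embeddings)            # pairwise distances
    same = labels[:, None] == labels[None, :]
    pos = (dist * same.float()).max(dim=1).values          # hardest positive
    neg = (dist + same.float() * 1e6).min(dim=1).values    # hardest negative
    triplet = F.relu(pos - neg + margin).mean()
    return triplet + lam * joint_representation_similarity(layer_feats, labels)
```

The regularizer here is only a stand-in for the paper's Joint Representation Similarity term; the key point it conveys is that the penalty acts on representations from several layers jointly rather than on the final embedding alone.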

Cite this Paper

BibTeX
@InProceedings{pmlr-v119-chu20a,
  title     = {Distance Metric Learning with Joint Representation Diversification},
  author    = {Chu, Xu and Lin, Yang and Wang, Yasha and Wang, Xiting and Yu, Hailong and Gao, Xin and Tong, Qi},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {1962--1973},
  year      = {2020},
  editor    = {III, Hal Daumé and Singh, Aarti},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/chu20a/chu20a.pdf},
  url       = {https://proceedings.mlr.press/v119/chu20a.html},
  abstract  = {Distance metric learning (DML) aims to learn a representation space equipped with a metric, such that similar examples are closer than dissimilar examples with respect to that metric. The recent success of DNNs has motivated many DML losses that encourage intra-class compactness and inter-class separability. The trade-off between intra-class compactness and inter-class separability shapes the DML representation space by determining how much information about the original inputs to retain. In this paper, we propose Distance Metric Learning with Joint Representation Diversification (JRD), which allows a better balancing point between intra-class compactness and inter-class separability. Specifically, we propose a Joint Representation Similarity regularizer that captures different abstraction levels of invariant features and diversifies the joint distributions of representations across multiple layers. Experiments on three deep DML benchmark datasets demonstrate the effectiveness of the proposed approach.}
}
Endnote
%0 Conference Paper
%T Distance Metric Learning with Joint Representation Diversification
%A Xu Chu
%A Yang Lin
%A Yasha Wang
%A Xiting Wang
%A Hailong Yu
%A Xin Gao
%A Qi Tong
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-chu20a
%I PMLR
%P 1962--1973
%U https://proceedings.mlr.press/v119/chu20a.html
%V 119
%X Distance metric learning (DML) aims to learn a representation space equipped with a metric, such that similar examples are closer than dissimilar examples with respect to that metric. The recent success of DNNs has motivated many DML losses that encourage intra-class compactness and inter-class separability. The trade-off between intra-class compactness and inter-class separability shapes the DML representation space by determining how much information about the original inputs to retain. In this paper, we propose Distance Metric Learning with Joint Representation Diversification (JRD), which allows a better balancing point between intra-class compactness and inter-class separability. Specifically, we propose a Joint Representation Similarity regularizer that captures different abstraction levels of invariant features and diversifies the joint distributions of representations across multiple layers. Experiments on three deep DML benchmark datasets demonstrate the effectiveness of the proposed approach.
APA
Chu, X., Lin, Y., Wang, Y., Wang, X., Yu, H., Gao, X. & Tong, Q. (2020). Distance Metric Learning with Joint Representation Diversification. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:1962-1973. Available from https://proceedings.mlr.press/v119/chu20a.html.