Affinity Weighted Embedding

Jason Weston, Ron Weiss, Hector Yee
Proceedings of the 31st International Conference on Machine Learning, PMLR 32(2):1215-1223, 2014.

Abstract

Supervised linear embedding models like Wsabie (Weston et al., 2011) and supervised semantic indexing (Bai et al., 2010) have proven successful at ranking, recommendation and annotation tasks. However, despite being scalable to large datasets they do not take full advantage of the extra data due to their linear nature, and we believe they typically underfit. We propose a new class of models which aim to provide improved performance while retaining many of the benefits of the existing class of embedding models. Our approach works by reweighting each component of the embedding of features and labels with a potentially nonlinear affinity function. We describe several variants of the family, and show its usefulness on several datasets.
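The reweighting idea in the abstract can be sketched numerically. Below is a hedged illustration (not the authors' code): a plain linear embedding model scores a feature/label pair as the dot product of their embeddings, while the affinity weighted variant multiplies each embedding component by a potentially nonlinear affinity term. The dimensions and the sigmoid affinity are illustrative placeholders, not choices from the paper.

```python
import numpy as np

# Illustrative sketch of affinity weighted embedding (assumed form):
#   linear model:   f(x, y) = (U x) . (V y)
#   affinity model: f(x, y) = sum_i G_i(x, y) * (U x)_i * (V y)_i
# All dimensions and the affinity function below are placeholders.

rng = np.random.default_rng(0)
d_feat, d_label, d_emb = 10, 5, 4

U = rng.normal(size=(d_emb, d_feat))   # feature embedding matrix
V = rng.normal(size=(d_emb, d_label))  # label embedding matrix

def linear_score(x, y):
    """Wsabie-style linear embedding score: (Ux) . (Vy)."""
    return float((U @ x) @ (V @ y))

def affinity_score(x, y, G):
    """Affinity weighted score: embedding component i is reweighted
    by a (potentially nonlinear) affinity G(x, y)[i]."""
    return float(np.sum(G(x, y) * (U @ x) * (V @ y)))

def sigmoid_affinity(x, y):
    """Placeholder nonlinear affinity: elementwise sigmoid of the
    componentwise product of the two embeddings."""
    return 1.0 / (1.0 + np.exp(-(U @ x) * (V @ y)))

x = rng.normal(size=d_feat)
y = rng.normal(size=d_label)
print(linear_score(x, y))
print(affinity_score(x, y, sigmoid_affinity))
```

Note that with a constant affinity of 1 in every component, the affinity weighted score reduces to the plain linear embedding score, which is why the authors describe this as a family that retains the benefits of existing embedding models.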

Cite this Paper


BibTeX
@InProceedings{pmlr-v32-weston14,
  title     = {Affinity Weighted Embedding},
  author    = {Jason Weston and Ron Weiss and Hector Yee},
  booktitle = {Proceedings of the 31st International Conference on Machine Learning},
  pages     = {1215--1223},
  year      = {2014},
  editor    = {Eric P. Xing and Tony Jebara},
  volume    = {32},
  number    = {2},
  series    = {Proceedings of Machine Learning Research},
  address   = {Beijing, China},
  month     = {22--24 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v32/weston14.pdf},
  url       = {http://proceedings.mlr.press/v32/weston14.html},
  abstract  = {Supervised linear embedding models like Wsabie (Weston et al., 2011) and supervised semantic indexing (Bai et al., 2010) have proven successful at ranking, recommendation and annotation tasks. However, despite being scalable to large datasets they do not take full advantage of the extra data due to their linear nature, and we believe they typically underfit. We propose a new class of models which aim to provide improved performance while retaining many of the benefits of the existing class of embedding models. Our approach works by reweighting each component of the embedding of features and labels with a potentially nonlinear affinity function. We describe several variants of the family, and show its usefulness on several datasets.}
}
Endnote
%0 Conference Paper
%T Affinity Weighted Embedding
%A Jason Weston
%A Ron Weiss
%A Hector Yee
%B Proceedings of the 31st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2014
%E Eric P. Xing
%E Tony Jebara
%F pmlr-v32-weston14
%I PMLR
%J Proceedings of Machine Learning Research
%P 1215--1223
%U http://proceedings.mlr.press
%V 32
%N 2
%W PMLR
%X Supervised linear embedding models like Wsabie (Weston et al., 2011) and supervised semantic indexing (Bai et al., 2010) have proven successful at ranking, recommendation and annotation tasks. However, despite being scalable to large datasets they do not take full advantage of the extra data due to their linear nature, and we believe they typically underfit. We propose a new class of models which aim to provide improved performance while retaining many of the benefits of the existing class of embedding models. Our approach works by reweighting each component of the embedding of features and labels with a potentially nonlinear affinity function. We describe several variants of the family, and show its usefulness on several datasets.
RIS
TY  - CPAPER
TI  - Affinity Weighted Embedding
AU  - Jason Weston
AU  - Ron Weiss
AU  - Hector Yee
BT  - Proceedings of the 31st International Conference on Machine Learning
PY  - 2014/01/27
DA  - 2014/01/27
ED  - Eric P. Xing
ED  - Tony Jebara
ID  - pmlr-v32-weston14
PB  - PMLR
SP  - 1215
EP  - 1223
DP  - PMLR
L1  - http://proceedings.mlr.press/v32/weston14.pdf
UR  - http://proceedings.mlr.press/v32/weston14.html
AB  - Supervised linear embedding models like Wsabie (Weston et al., 2011) and supervised semantic indexing (Bai et al., 2010) have proven successful at ranking, recommendation and annotation tasks. However, despite being scalable to large datasets they do not take full advantage of the extra data due to their linear nature, and we believe they typically underfit. We propose a new class of models which aim to provide improved performance while retaining many of the benefits of the existing class of embedding models. Our approach works by reweighting each component of the embedding of features and labels with a potentially nonlinear affinity function. We describe several variants of the family, and show its usefulness on several datasets.
ER  -
APA
Weston, J., Weiss, R., &amp; Yee, H. (2014). Affinity Weighted Embedding. Proceedings of the 31st International Conference on Machine Learning, in PMLR 32(2):1215-1223.