RaFM: Rank-Aware Factorization Machines

Xiaoshuang Chen, Yin Zheng, Jiaxing Wang, Wenye Ma, Junzhou Huang
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:1132-1140, 2019.

Abstract

Factorization machines (FMs) are a popular model class for learning pairwise interactions through a low-rank approximation. Different from existing FM-based approaches, which use a fixed rank for all features, this paper proposes a Rank-Aware FM (RaFM) model which adopts pairwise interactions from embeddings with different ranks. The proposed model achieves better performance on real-world datasets where different features have significantly varying frequencies of occurrence. Moreover, we prove that the RaFM model can be stored, evaluated, and trained as efficiently as a single FM, and that under some reasonable conditions it can even be significantly more efficient than FM. RaFM improves the performance of FMs in both regression tasks and classification tasks while incurring less computational burden, and therefore also has attractive potential in industrial applications.
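
For context (this is the standard factorization machine of Rendle, 2010, not the RaFM formulation from the paper itself), an FM with a single shared embedding rank k scores an input x as

    \hat{y}(\mathbf{x}) = w_0 + \sum_{i=1}^{n} w_i x_i + \sum_{i=1}^{n} \sum_{j=i+1}^{n} \langle \mathbf{v}_i, \mathbf{v}_j \rangle \, x_i x_j, \qquad \mathbf{v}_i \in \mathbb{R}^{k}.

RaFM, as described in the abstract, instead draws the pairwise interactions from embeddings whose rank may differ across features, presumably so that features with very different frequencies of occurrence need not share a single rank k.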

Cite this Paper


BibTeX
@InProceedings{pmlr-v97-chen19n,
  title = {{R}a{FM}: Rank-Aware Factorization Machines},
  author = {Chen, Xiaoshuang and Zheng, Yin and Wang, Jiaxing and Ma, Wenye and Huang, Junzhou},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning},
  pages = {1132--1140},
  year = {2019},
  editor = {Chaudhuri, Kamalika and Salakhutdinov, Ruslan},
  volume = {97},
  series = {Proceedings of Machine Learning Research},
  month = {09--15 Jun},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v97/chen19n/chen19n.pdf},
  url = {https://proceedings.mlr.press/v97/chen19n.html},
  abstract = {Factorization machines (FMs) are a popular model class for learning pairwise interactions through a low-rank approximation. Different from existing FM-based approaches, which use a fixed rank for all features, this paper proposes a Rank-Aware FM (RaFM) model which adopts pairwise interactions from embeddings with different ranks. The proposed model achieves better performance on real-world datasets where different features have significantly varying frequencies of occurrence. Moreover, we prove that the RaFM model can be stored, evaluated, and trained as efficiently as a single FM, and that under some reasonable conditions it can even be significantly more efficient than FM. RaFM improves the performance of FMs in both regression tasks and classification tasks while incurring less computational burden, and therefore also has attractive potential in industrial applications.}
}
Endnote
%0 Conference Paper
%T RaFM: Rank-Aware Factorization Machines
%A Xiaoshuang Chen
%A Yin Zheng
%A Jiaxing Wang
%A Wenye Ma
%A Junzhou Huang
%B Proceedings of the 36th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Ruslan Salakhutdinov
%F pmlr-v97-chen19n
%I PMLR
%P 1132--1140
%U https://proceedings.mlr.press/v97/chen19n.html
%V 97
%X Factorization machines (FMs) are a popular model class for learning pairwise interactions through a low-rank approximation. Different from existing FM-based approaches, which use a fixed rank for all features, this paper proposes a Rank-Aware FM (RaFM) model which adopts pairwise interactions from embeddings with different ranks. The proposed model achieves better performance on real-world datasets where different features have significantly varying frequencies of occurrence. Moreover, we prove that the RaFM model can be stored, evaluated, and trained as efficiently as a single FM, and that under some reasonable conditions it can even be significantly more efficient than FM. RaFM improves the performance of FMs in both regression tasks and classification tasks while incurring less computational burden, and therefore also has attractive potential in industrial applications.
APA
Chen, X., Zheng, Y., Wang, J., Ma, W. & Huang, J. (2019). RaFM: Rank-Aware Factorization Machines. Proceedings of the 36th International Conference on Machine Learning, in Proceedings of Machine Learning Research 97:1132-1140. Available from https://proceedings.mlr.press/v97/chen19n.html.