Online matrix prediction for sparse loss matrices

Ken-ichiro Moridomi, Kohei Hatano, Eiji Takimoto, Koji Tsuda
Proceedings of the Sixth Asian Conference on Machine Learning, PMLR 39:250-265, 2015.

Abstract

We consider an online matrix prediction problem. Follow the Regularized Leader (FTRL) is a standard method for online prediction tasks: at each trial it predicts by minimizing the sum of the cumulative loss functions and a regularizer. Three regularizers are popular for matrices: the Frobenius norm, the quantum relative entropy, and the log-determinant. We propose an FTRL-based algorithm with the log-determinant regularizer and prove a regret bound for it. Our main contribution is to show that log-determinant regularization is effective in the setting where the loss matrices are sparse. We also show that an algorithm with log-determinant regularization achieves optimal performance for the online collaborative filtering problem.
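For concreteness, a minimal sketch of the FTRL prediction rule described above, instantiated with the log-determinant regularizer. The notation (decision set \mathcal{K}, per-trial loss functions \ell_s, learning rate \eta) and the exact form of R(X) are our assumptions and are not taken from this page:

    X_t = \operatorname{argmin}_{X \in \mathcal{K}} \Big( \sum_{s=1}^{t-1} \ell_s(X) + \tfrac{1}{\eta} R(X) \Big), \qquad R(X) = -\ln\det(X),

where \mathcal{K} is a set of symmetric positive definite matrices, so that \ln\det(X) is well defined.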

Cite this Paper


BibTeX
@InProceedings{pmlr-v39-moridomi14,
  title     = {Online matrix prediction for sparse loss matrices},
  author    = {Moridomi, Ken-ichiro and Hatano, Kohei and Takimoto, Eiji and Tsuda, Koji},
  booktitle = {Proceedings of the Sixth Asian Conference on Machine Learning},
  pages     = {250--265},
  year      = {2015},
  editor    = {Phung, Dinh and Li, Hang},
  volume    = {39},
  series    = {Proceedings of Machine Learning Research},
  address   = {Nha Trang City, Vietnam},
  month     = {26--28 Nov},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v39/moridomi14.pdf},
  url       = {https://proceedings.mlr.press/v39/moridomi14.html}
}
APA
Moridomi, K., Hatano, K., Takimoto, E. & Tsuda, K. (2015). Online matrix prediction for sparse loss matrices. Proceedings of the Sixth Asian Conference on Machine Learning, in Proceedings of Machine Learning Research 39:250-265. Available from https://proceedings.mlr.press/v39/moridomi14.html.
