Online matrix prediction for sparse loss matrices

Ken-ichiro Moridomi, Kohei Hatano, Eiji Takimoto, Koji Tsuda
Proceedings of the Sixth Asian Conference on Machine Learning, PMLR 39:250-265, 2015.

Abstract

We consider an online matrix prediction problem. Follow the Regularized Leader (FTRL) is a well-known method for online prediction tasks: it makes each prediction by minimizing the cumulative loss functions plus a regularizer. Three regularizers are popular for matrices: the Frobenius norm, the quantum relative entropy, and the log-determinant. We propose an FTRL-based algorithm with the log-determinant regularizer and prove a regret bound for it. Our main contribution is to show that log-determinant regularization is efficient in the sparse loss matrix setting. We also give an algorithm with optimal performance for the online collaborative filtering problem using log-determinant regularization.
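The FTRL scheme described in the abstract can be sketched in a few lines. The following is a minimal illustration (not the paper's algorithm) for the special case of linear matrix losses \ell_t(X) = ⟨L_t, X⟩ over positive definite predictions, where the FTRL objective with a log-determinant regularizer admits a closed-form minimizer; the function name, the learning rate `eta`, and the positive-definiteness assumption on the cumulative loss matrix are all assumptions made for this sketch.

```python
import numpy as np

def ftrl_logdet(loss_matrices, eta=0.1):
    """FTRL sketch with log-determinant regularizer R(X) = -(1/eta) log det X.

    For linear losses l_t(X) = <L_t, X> over positive definite X, the FTRL
    objective  <C_t, X> - (1/eta) log det X,  with C_t = sum_{s<=t} L_s,
    has gradient C_t - (1/eta) X^{-1}, so the minimizer is
    X_{t+1} = (eta * C_t)^{-1}  (assuming C_t is positive definite).
    """
    n = loss_matrices[0].shape[0]
    C = np.zeros((n, n))          # cumulative loss matrix
    predictions = []
    for L in loss_matrices:
        C = C + L
        X = np.linalg.inv(eta * C)  # closed-form FTRL minimizer
        predictions.append(X)
    return predictions
```

For general (non-linear) losses the inner minimization has no closed form and must be solved numerically; the sketch only conveys the "minimize cumulative loss plus regularizer" structure.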

Cite this Paper


BibTeX
@InProceedings{pmlr-v39-moridomi14,
  title     = {Online matrix prediction for sparse loss matrices},
  author    = {Ken-ichiro Moridomi and Kohei Hatano and Eiji Takimoto and Koji Tsuda},
  booktitle = {Proceedings of the Sixth Asian Conference on Machine Learning},
  pages     = {250--265},
  year      = {2015},
  editor    = {Dinh Phung and Hang Li},
  volume    = {39},
  series    = {Proceedings of Machine Learning Research},
  address   = {Nha Trang City, Vietnam},
  month     = {26--28 Nov},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v39/moridomi14.pdf},
  url       = {http://proceedings.mlr.press/v39/moridomi14.html},
  abstract  = {We consider an online matrix prediction problem. Follow the Regularized Leader (FTRL) is a well-known method for online prediction tasks: it makes each prediction by minimizing the cumulative loss functions plus a regularizer. Three regularizers are popular for matrices: the Frobenius norm, the quantum relative entropy, and the log-determinant. We propose an FTRL-based algorithm with the log-determinant regularizer and prove a regret bound for it. Our main contribution is to show that log-determinant regularization is efficient in the sparse loss matrix setting. We also give an algorithm with optimal performance for the online collaborative filtering problem using log-determinant regularization.}
}
Endnote
%0 Conference Paper
%T Online matrix prediction for sparse loss matrices
%A Ken-ichiro Moridomi
%A Kohei Hatano
%A Eiji Takimoto
%A Koji Tsuda
%B Proceedings of the Sixth Asian Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2015
%E Dinh Phung
%E Hang Li
%F pmlr-v39-moridomi14
%I PMLR
%J Proceedings of Machine Learning Research
%P 250--265
%U http://proceedings.mlr.press
%V 39
%W PMLR
%X We consider an online matrix prediction problem. Follow the Regularized Leader (FTRL) is a well-known method for online prediction tasks: it makes each prediction by minimizing the cumulative loss functions plus a regularizer. Three regularizers are popular for matrices: the Frobenius norm, the quantum relative entropy, and the log-determinant. We propose an FTRL-based algorithm with the log-determinant regularizer and prove a regret bound for it. Our main contribution is to show that log-determinant regularization is efficient in the sparse loss matrix setting. We also give an algorithm with optimal performance for the online collaborative filtering problem using log-determinant regularization.
RIS
TY  - CPAPER
TI  - Online matrix prediction for sparse loss matrices
AU  - Ken-ichiro Moridomi
AU  - Kohei Hatano
AU  - Eiji Takimoto
AU  - Koji Tsuda
BT  - Proceedings of the Sixth Asian Conference on Machine Learning
PY  - 2015/02/16
DA  - 2015/02/16
ED  - Dinh Phung
ED  - Hang Li
ID  - pmlr-v39-moridomi14
PB  - PMLR
SP  - 250
EP  - 265
DP  - PMLR
L1  - http://proceedings.mlr.press/v39/moridomi14.pdf
UR  - http://proceedings.mlr.press/v39/moridomi14.html
AB  - We consider an online matrix prediction problem. Follow the Regularized Leader (FTRL) is a well-known method for online prediction tasks: it makes each prediction by minimizing the cumulative loss functions plus a regularizer. Three regularizers are popular for matrices: the Frobenius norm, the quantum relative entropy, and the log-determinant. We propose an FTRL-based algorithm with the log-determinant regularizer and prove a regret bound for it. Our main contribution is to show that log-determinant regularization is efficient in the sparse loss matrix setting. We also give an algorithm with optimal performance for the online collaborative filtering problem using log-determinant regularization.
ER  -
APA
Moridomi, K., Hatano, K., Takimoto, E. & Tsuda, K. (2015). Online matrix prediction for sparse loss matrices. Proceedings of the Sixth Asian Conference on Machine Learning, in PMLR 39:250-265.