Doubly Decomposing Nonparametric Tensor Regression

Masaaki Imaizumi, Kohei Hayashi
Proceedings of The 33rd International Conference on Machine Learning, PMLR 48:727-736, 2016.

Abstract

A nonparametric extension of tensor regression is proposed. Nonlinearity in a high-dimensional tensor space is broken into simple local functions by incorporating low-rank tensor decomposition. Compared to naive nonparametric approaches, our formulation considerably improves the convergence rate of estimation while maintaining consistency with the same function class under specific conditions. To estimate the local functions, we develop a Bayesian estimator with a Gaussian process prior. Experimental results confirm its theoretical properties and show high performance in predicting a summary statistic of a real complex network.
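The abstract sketches the core recipe: compress the tensor input with a low-rank decomposition, then fit the remaining nonlinearity nonparametrically with a Gaussian process. As a hedged illustration of that recipe only, not the authors' doubly-decomposed estimator, the following NumPy sketch pairs a truncated-SVD feature step with plain GP regression; all function names and the synthetic data are invented for this example.

```python
import numpy as np

def low_rank_features(X_train, X_new, rank):
    """Project vectorized tensors onto the top-`rank` right singular
    vectors estimated from the training data (a crude low-rank step)."""
    _, _, Vt = np.linalg.svd(X_train, full_matrices=False)
    V = Vt[:rank].T
    return X_train @ V, X_new @ V

def rbf_kernel(A, B, lengthscale=1.0):
    # Squared-exponential kernel between row vectors of A and B.
    sq = (np.sum(A**2, axis=1)[:, None]
          + np.sum(B**2, axis=1)[None, :] - 2.0 * A @ B.T)
    return np.exp(-sq / (2.0 * lengthscale**2))

def gp_predict(Z_train, y_train, Z_test, noise=1e-2):
    """GP regression posterior mean with an RBF kernel (zero prior mean)."""
    K = rbf_kernel(Z_train, Z_train) + noise * np.eye(len(y_train))
    alpha = np.linalg.solve(K, y_train)
    return rbf_kernel(Z_test, Z_train) @ alpha

# Synthetic demo: vectorized 5x4x3 tensors with hidden rank-2 structure.
rng = np.random.default_rng(0)
n, d, r = 200, 5 * 4 * 3, 2
S = rng.normal(size=(n, r))                   # latent low-rank scores
W = rng.normal(size=(r, d))                   # loading matrix
X = S @ W + 0.01 * rng.normal(size=(n, d))    # observed tensor inputs
y = np.sin(S[:, 0]) + 0.5 * np.cos(S[:, 1])   # smooth nonlinear response

Xtr, Xte, ytr, yte = X[:150], X[150:], y[:150], y[150:]
Ztr, Zte = low_rank_features(Xtr, Xte, rank=r)
scale = Ztr.std(axis=0)                       # normalize feature scales
y_hat = gp_predict(Ztr / scale, ytr, Zte / scale)
mse = np.mean((y_hat - yte) ** 2)
```

Because the regression is done in the two-dimensional compressed space rather than the 60-dimensional ambient space, far fewer samples are needed to track the nonlinearity, which is the intuition behind the improved convergence rate claimed in the abstract.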

Cite this Paper


BibTeX
@InProceedings{pmlr-v48-imaizumi16,
  title     = {Doubly Decomposing Nonparametric Tensor Regression},
  author    = {Imaizumi, Masaaki and Hayashi, Kohei},
  booktitle = {Proceedings of The 33rd International Conference on Machine Learning},
  pages     = {727--736},
  year      = {2016},
  editor    = {Balcan, Maria Florina and Weinberger, Kilian Q.},
  volume    = {48},
  series    = {Proceedings of Machine Learning Research},
  address   = {New York, New York, USA},
  month     = {20--22 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v48/imaizumi16.pdf},
  url       = {https://proceedings.mlr.press/v48/imaizumi16.html},
  abstract  = {A nonparametric extension of tensor regression is proposed. Nonlinearity in a high-dimensional tensor space is broken into simple local functions by incorporating low-rank tensor decomposition. Compared to naive nonparametric approaches, our formulation considerably improves the convergence rate of estimation while maintaining consistency with the same function class under specific conditions. To estimate the local functions, we develop a Bayesian estimator with a Gaussian process prior. Experimental results confirm its theoretical properties and show high performance in predicting a summary statistic of a real complex network.}
}
Endnote
%0 Conference Paper
%T Doubly Decomposing Nonparametric Tensor Regression
%A Masaaki Imaizumi
%A Kohei Hayashi
%B Proceedings of The 33rd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2016
%E Maria Florina Balcan
%E Kilian Q. Weinberger
%F pmlr-v48-imaizumi16
%I PMLR
%P 727--736
%U https://proceedings.mlr.press/v48/imaizumi16.html
%V 48
%X A nonparametric extension of tensor regression is proposed. Nonlinearity in a high-dimensional tensor space is broken into simple local functions by incorporating low-rank tensor decomposition. Compared to naive nonparametric approaches, our formulation considerably improves the convergence rate of estimation while maintaining consistency with the same function class under specific conditions. To estimate the local functions, we develop a Bayesian estimator with a Gaussian process prior. Experimental results confirm its theoretical properties and show high performance in predicting a summary statistic of a real complex network.
RIS
TY - CPAPER
TI - Doubly Decomposing Nonparametric Tensor Regression
AU - Masaaki Imaizumi
AU - Kohei Hayashi
BT - Proceedings of The 33rd International Conference on Machine Learning
DA - 2016/06/11
ED - Maria Florina Balcan
ED - Kilian Q. Weinberger
ID - pmlr-v48-imaizumi16
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 48
SP - 727
EP - 736
L1 - http://proceedings.mlr.press/v48/imaizumi16.pdf
UR - https://proceedings.mlr.press/v48/imaizumi16.html
AB - A nonparametric extension of tensor regression is proposed. Nonlinearity in a high-dimensional tensor space is broken into simple local functions by incorporating low-rank tensor decomposition. Compared to naive nonparametric approaches, our formulation considerably improves the convergence rate of estimation while maintaining consistency with the same function class under specific conditions. To estimate the local functions, we develop a Bayesian estimator with a Gaussian process prior. Experimental results confirm its theoretical properties and show high performance in predicting a summary statistic of a real complex network.
ER -
APA
Imaizumi, M. & Hayashi, K. (2016). Doubly Decomposing Nonparametric Tensor Regression. Proceedings of The 33rd International Conference on Machine Learning, in Proceedings of Machine Learning Research 48:727-736. Available from https://proceedings.mlr.press/v48/imaizumi16.html.