A Unified Weight Learning Paradigm for Multi-view Learning

Lai Tian, Feiping Nie, Xuelong Li
Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics, PMLR 89:2790-2800, 2019.

Abstract

Learning a set of weights to combine views linearly underlies a family of popular schemes in multi-view learning. Three weight learning paradigms, i.e., Norm Regularization (NR), Exponential Decay (ED), and p-th Root Loss (pRL), are widely used in the literature, yet the relations among them and their limiting behaviors are not well understood. In this paper, we present a Unified Paradigm (UP) that contains the three aforementioned paradigms as special cases. Specifically, we extend the domain of the NR hyper-parameter from positive to real numbers and show that this extension bridges NR, ED, and pRL. In addition, we provide a detailed discussion of weight sparsity, hyper-parameter settings, and the counterintuitive limiting behavior of these paradigms. Furthermore, we demonstrate the generality of our technique with examples in Multi-Task Learning and Fuzzy Clustering. Our results may provide insights for better understanding existing algorithms and inspire research on new weight learning schemes. Numerical results support our theoretical analysis.
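
As a brief, hedged illustration of the paradigms named above (the notation and formulations below are assumed from common usage in the multi-view literature, not quoted from the paper), the three schemes are typically written along the following lines, where $L_v(\theta)$ is the loss of view $v$, $\alpha_v$ its weight on the probability simplex, and $\gamma$, $p$ are hyper-parameters:

% Norm Regularization (NR): weighted view losses plus a norm penalty
% on the weights, with hyper-parameter gamma > 0.
\min_{\alpha \ge 0,\; \mathbf{1}^{\top}\alpha = 1} \; \sum_{v=1}^{V} \alpha_v L_v(\theta) + \gamma \lVert \alpha \rVert_2^2

% Exponential Decay (ED): weights decay exponentially in the view loss,
% i.e., the closed form of entropy-regularized weighting.
\alpha_v \;\propto\; \exp\!\bigl(-L_v(\theta)/\gamma\bigr)

% p-th Root Loss (pRL): minimize a root of each view loss; the induced
% weights scale as \alpha_v \propto L_v^{1/p - 1}.
\min_{\theta} \; \sum_{v=1}^{V} L_v(\theta)^{1/p}, \qquad p > 1

Under this reading, the unification described in the abstract corresponds to letting the NR hyper-parameter range over all real numbers, with ED- and pRL-style weighting arising in particular regimes; see the paper for the precise statement.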

Cite this Paper

BibTeX
@InProceedings{pmlr-v89-tian19a,
  title     = {A Unified Weight Learning Paradigm for Multi-view Learning},
  author    = {Tian, Lai and Nie, Feiping and Li, Xuelong},
  booktitle = {Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics},
  pages     = {2790--2800},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Sugiyama, Masashi},
  volume    = {89},
  series    = {Proceedings of Machine Learning Research},
  month     = {16--18 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v89/tian19a/tian19a.pdf},
  url       = {https://proceedings.mlr.press/v89/tian19a.html}
}
Endnote
%0 Conference Paper
%T A Unified Weight Learning Paradigm for Multi-view Learning
%A Lai Tian
%A Feiping Nie
%A Xuelong Li
%B Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Masashi Sugiyama
%F pmlr-v89-tian19a
%I PMLR
%P 2790--2800
%U https://proceedings.mlr.press/v89/tian19a.html
%V 89
APA
Tian, L., Nie, F. & Li, X. (2019). A Unified Weight Learning Paradigm for Multi-view Learning. Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 89:2790-2800. Available from https://proceedings.mlr.press/v89/tian19a.html.
