PCA-based Multi-Task Learning: a Random Matrix Approach

Malik Tiomoko, Romain Couillet, Frederic Pascal
Proceedings of the 40th International Conference on Machine Learning, PMLR 202:34280-34300, 2023.

Abstract

The article proposes and theoretically analyses a computationally efficient multi-task learning (MTL) extension of popular principal component analysis (PCA)-based supervised learning schemes. The analysis reveals that (i) by default, learning may dramatically fail by suffering from negative transfer, but that (ii) simple counter-measures on data labels avert negative transfer and necessarily result in improved performances. Supporting experiments on synthetic and real data benchmarks show that the proposed method achieves comparable performance with state-of-the-art MTL methods but at a significantly reduced computational cost.
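The paper's full method is not reproduced here, but the PCA-based supervised classification scheme it builds on can be sketched in a few lines: project centered data onto the top principal directions, then assign each point to the nearest class mean in the projected space. This is a minimal single-task illustration with synthetic data and hypothetical parameter choices (`k`, the class separation), not the paper's multi-task algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic two-class data: class means differ along the first coordinate.
n, p = 200, 50
mu = np.zeros(p)
mu[0] = 2.0
X = np.vstack([rng.normal(size=(n, p)) + mu,
               rng.normal(size=(n, p)) - mu])
y = np.array([1] * n + [-1] * n)

# PCA via SVD of the centered data matrix: keep the top-k directions.
k = 2
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:k].T  # scores in the k-dimensional principal subspace

# Classify by nearest class mean in the projected space.
m_pos = Z[y == 1].mean(axis=0)
m_neg = Z[y == -1].mean(axis=0)
pred = np.where(np.linalg.norm(Z - m_pos, axis=1)
                < np.linalg.norm(Z - m_neg, axis=1), 1, -1)
acc = (pred == y).mean()
```

The paper's MTL extension pools data across tasks and, crucially, adjusts the label scores per task; the analysis shows that without such label counter-measures this kind of subspace projection can transfer negatively between tasks.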

Cite this Paper


BibTeX
@InProceedings{pmlr-v202-tiomoko23a,
  title     = {{PCA}-based Multi-Task Learning: a Random Matrix Approach},
  author    = {Tiomoko, Malik and Couillet, Romain and Pascal, Frederic},
  booktitle = {Proceedings of the 40th International Conference on Machine Learning},
  pages     = {34280--34300},
  year      = {2023},
  editor    = {Krause, Andreas and Brunskill, Emma and Cho, Kyunghyun and Engelhardt, Barbara and Sabato, Sivan and Scarlett, Jonathan},
  volume    = {202},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--29 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v202/tiomoko23a/tiomoko23a.pdf},
  url       = {https://proceedings.mlr.press/v202/tiomoko23a.html},
  abstract  = {The article proposes and theoretically analyses a computationally efficient multi-task learning (MTL) extension of popular principal component analysis (PCA)-based supervised learning schemes. The analysis reveals that (i) by default, learning may dramatically fail by suffering from negative transfer, but that (ii) simple counter-measures on data labels avert negative transfer and necessarily result in improved performances. Supporting experiments on synthetic and real data benchmarks show that the proposed method achieves comparable performance with state-of-the-art MTL methods but at a significantly reduced computational cost.}
}
Endnote
%0 Conference Paper
%T PCA-based Multi-Task Learning: a Random Matrix Approach
%A Malik Tiomoko
%A Romain Couillet
%A Frederic Pascal
%B Proceedings of the 40th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Andreas Krause
%E Emma Brunskill
%E Kyunghyun Cho
%E Barbara Engelhardt
%E Sivan Sabato
%E Jonathan Scarlett
%F pmlr-v202-tiomoko23a
%I PMLR
%P 34280--34300
%U https://proceedings.mlr.press/v202/tiomoko23a.html
%V 202
%X The article proposes and theoretically analyses a computationally efficient multi-task learning (MTL) extension of popular principal component analysis (PCA)-based supervised learning schemes. The analysis reveals that (i) by default, learning may dramatically fail by suffering from negative transfer, but that (ii) simple counter-measures on data labels avert negative transfer and necessarily result in improved performances. Supporting experiments on synthetic and real data benchmarks show that the proposed method achieves comparable performance with state-of-the-art MTL methods but at a significantly reduced computational cost.
APA
Tiomoko, M., Couillet, R., & Pascal, F. (2023). PCA-based Multi-Task Learning: a Random Matrix Approach. Proceedings of the 40th International Conference on Machine Learning, in Proceedings of Machine Learning Research 202:34280-34300. Available from https://proceedings.mlr.press/v202/tiomoko23a.html.

Related Material