Generalized Concomitant Multi-Task Lasso for Sparse Multimodal Regression

Mathurin Massias, Olivier Fercoq, Alexandre Gramfort, Joseph Salmon
Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics, PMLR 84:998-1007, 2018.

Abstract

In high dimension, it is customary to consider Lasso-type estimators to enforce sparsity. For standard Lasso theory to hold, the regularization parameter should be proportional to the noise level, which is often unknown in practice. A remedy is to consider estimators such as the Concomitant Lasso, which jointly optimize over the regression coefficients and the noise level. However, when data from different sources are pooled to increase sample size, noise levels differ and new dedicated estimators are needed. We provide new statistical and computational solutions to perform heteroscedastic regression, with an emphasis on functional brain imaging with magneto- and electroencephalography (M/EEG). When instantiated to de-correlated noise, our framework leads to an efficient algorithm whose computational cost is not higher than for the Lasso, but addresses more complex noise structures. Experiments demonstrate improved prediction and support identification with correct estimation of noise levels.
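For context, the classical single-task Concomitant Lasso that the paper generalizes is usually written as a joint minimization over the coefficients $\beta$ and the noise level $\sigma$ (this standard formulation is given here for reference; the paper's multi-task, heteroscedastic objective replaces the scalar $\sigma$ with a noise covariance structure):

```latex
\hat{\beta}, \hat{\sigma} \in
\argmin_{\beta \in \mathbb{R}^p,\; \sigma > 0}
\; \frac{\lVert y - X\beta \rVert_2^2}{2 n \sigma}
+ \frac{\sigma}{2}
+ \lambda \lVert \beta \rVert_1
```

Because $\sigma$ is estimated jointly, the regularization parameter $\lambda$ can be chosen independently of the (unknown) noise level, which is the property the abstract refers to.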

Cite this Paper


BibTeX
@InProceedings{pmlr-v84-massias18a,
  title     = {Generalized Concomitant Multi-Task Lasso for Sparse Multimodal Regression},
  author    = {Massias, Mathurin and Fercoq, Olivier and Gramfort, Alexandre and Salmon, Joseph},
  booktitle = {Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics},
  pages     = {998--1007},
  year      = {2018},
  editor    = {Storkey, Amos and Perez-Cruz, Fernando},
  volume    = {84},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--11 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v84/massias18a/massias18a.pdf},
  url       = {https://proceedings.mlr.press/v84/massias18a.html},
  abstract  = {In high dimension, it is customary to consider Lasso-type estimators to enforce sparsity. For standard Lasso theory to hold, the regularization parameter should be proportional to the noise level, which is often unknown in practice. A remedy is to consider estimators such as the Concomitant Lasso, which jointly optimize over the regression coefficients and the noise level. However, when data from different sources are pooled to increase sample size, noise levels differ and new dedicated estimators are needed. We provide new statistical and computational solutions to perform heteroscedastic regression, with an emphasis on functional brain imaging with magneto- and electroencephalography (M/EEG). When instantiated to de-correlated noise, our framework leads to an efficient algorithm whose computational cost is not higher than for the Lasso, but addresses more complex noise structures. Experiments demonstrate improved prediction and support identification with correct estimation of noise levels.}
}
Endnote
%0 Conference Paper
%T Generalized Concomitant Multi-Task Lasso for Sparse Multimodal Regression
%A Mathurin Massias
%A Olivier Fercoq
%A Alexandre Gramfort
%A Joseph Salmon
%B Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2018
%E Amos Storkey
%E Fernando Perez-Cruz
%F pmlr-v84-massias18a
%I PMLR
%P 998--1007
%U https://proceedings.mlr.press/v84/massias18a.html
%V 84
%X In high dimension, it is customary to consider Lasso-type estimators to enforce sparsity. For standard Lasso theory to hold, the regularization parameter should be proportional to the noise level, which is often unknown in practice. A remedy is to consider estimators such as the Concomitant Lasso, which jointly optimize over the regression coefficients and the noise level. However, when data from different sources are pooled to increase sample size, noise levels differ and new dedicated estimators are needed. We provide new statistical and computational solutions to perform heteroscedastic regression, with an emphasis on functional brain imaging with magneto- and electroencephalography (M/EEG). When instantiated to de-correlated noise, our framework leads to an efficient algorithm whose computational cost is not higher than for the Lasso, but addresses more complex noise structures. Experiments demonstrate improved prediction and support identification with correct estimation of noise levels.
APA
Massias, M., Fercoq, O., Gramfort, A., & Salmon, J. (2018). Generalized Concomitant Multi-Task Lasso for Sparse Multimodal Regression. Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research, 84:998-1007. Available from https://proceedings.mlr.press/v84/massias18a.html.