Sparsity and the Truncated $l^2$-norm

Lee Dicker
Proceedings of the Seventeenth International Conference on Artificial Intelligence and Statistics, PMLR 33:159-166, 2014.

Abstract

Sparsity is a fundamental topic in high-dimensional data analysis. Perhaps the most common measures of sparsity are the $l^p$-norms, for $p < 2$. In this paper, we study an alternative measure of sparsity, the truncated $l^2$-norm, which is related to other $l^p$-norms, but appears to have some unique and useful properties. Focusing on the $n$-dimensional Gaussian location model, we derive exact asymptotic minimax results for estimation over truncated $l^2$-balls, which complement existing results for $l^p$-balls. We then propose simple new adaptive thresholding estimators that are inspired by the truncated $l^2$-norm and are adaptive asymptotic minimax over $l^p$-balls ($p < 2$), as well as truncated $l^2$-balls. Finally, we derive lower bounds on the Bayes risk of an estimator, in terms of the parameter’s truncated $l^2$-norm. These bounds provide necessary conditions for Bayes risk consistency in certain problems that are relevant for high-dimensional Bayesian modeling.
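
The abstract does not define the truncated $l^2$-norm itself. As a purely hypothetical sketch (the capping level $t$ and the exact form below are assumptions, not taken from the paper), one natural way to truncate the squared $l^2$-norm is to cap each coordinate's contribution:

$$\|\theta\|_{2,t}^2 = \sum_{i=1}^{n} \min(\theta_i^2, t^2), \qquad \theta \in \mathbb{R}^n, \ t > 0.$$

Under a measure of this kind, a vector with a few large coordinates and many near-zero ones scores low, in the same spirit as membership in a small $l^p$-ball with $p < 2$.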
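The abstract also refers to thresholding estimators for the $n$-dimensional Gaussian location model $y_i = \theta_i + \sigma z_i$, $z_i \sim N(0, 1)$. The paper's new adaptive estimators are not reproduced here; the sketch below is only the classical baseline in this literature, coordinatewise soft thresholding at the universal level $\sigma\sqrt{2\log n}$ of Donoho and Johnstone (the signal pattern and constants are illustrative assumptions).

```python
import numpy as np

def soft_threshold(y, lam):
    """Coordinatewise soft thresholding: sign(y) * max(|y| - lam, 0)."""
    return np.sign(y) * np.maximum(np.abs(y) - lam, 0.0)

# Gaussian location model: y_i = theta_i + sigma * z_i, z_i ~ N(0, 1).
rng = np.random.default_rng(0)
n, sigma = 10_000, 1.0
theta = np.zeros(n)
theta[:50] = 5.0                        # a sparse signal (assumption)
y = theta + sigma * rng.standard_normal(n)

# Universal threshold sigma * sqrt(2 log n); the paper's adaptive
# estimators instead choose the threshold from the data.
lam = sigma * np.sqrt(2.0 * np.log(n))
theta_hat = soft_threshold(y, lam)
print(np.mean((theta_hat - theta) ** 2))  # average squared error per coordinate
```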

Cite this Paper


BibTeX
@InProceedings{pmlr-v33-dicker14,
  title     = {Sparsity and the Truncated $l^2$-norm},
  author    = {Dicker, Lee},
  booktitle = {Proceedings of the Seventeenth International Conference on Artificial Intelligence and Statistics},
  pages     = {159--166},
  year      = {2014},
  editor    = {Kaski, Samuel and Corander, Jukka},
  volume    = {33},
  series    = {Proceedings of Machine Learning Research},
  address   = {Reykjavik, Iceland},
  month     = {22--25 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v33/dicker14.pdf},
  url       = {https://proceedings.mlr.press/v33/dicker14.html}
}
Endnote
%0 Conference Paper
%T Sparsity and the Truncated $l^2$-norm
%A Lee Dicker
%B Proceedings of the Seventeenth International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2014
%E Samuel Kaski
%E Jukka Corander
%F pmlr-v33-dicker14
%I PMLR
%P 159--166
%U https://proceedings.mlr.press/v33/dicker14.html
%V 33
RIS
TY  - CPAPER
TI  - Sparsity and the Truncated $l^2$-norm
AU  - Lee Dicker
BT  - Proceedings of the Seventeenth International Conference on Artificial Intelligence and Statistics
DA  - 2014/04/02
ED  - Samuel Kaski
ED  - Jukka Corander
ID  - pmlr-v33-dicker14
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 33
SP  - 159
EP  - 166
L1  - http://proceedings.mlr.press/v33/dicker14.pdf
UR  - https://proceedings.mlr.press/v33/dicker14.html
ER  -
APA
Dicker, L. (2014). Sparsity and the Truncated $l^2$-norm. Proceedings of the Seventeenth International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 33:159-166. Available from https://proceedings.mlr.press/v33/dicker14.html.
