Geometric Conditions for Subspace-Sparse Recovery

Chong You, Rene Vidal
; Proceedings of the 32nd International Conference on Machine Learning, PMLR 37:1585-1593, 2015.

Abstract

Given a dictionary $\Pi$ and a signal $\xi = \Pi \mathbf{x}$ generated by a few \textit{linearly independent} columns of $\Pi$, classical sparse recovery theory deals with the problem of uniquely recovering the sparse representation $\mathbf{x}$ of $\xi$. In this work, we consider the more general case where $\xi$ lies in a low-dimensional subspace spanned by a few columns of $\Pi$, which are possibly \textit{linearly dependent}. In this case, $\mathbf{x}$ may not be unique, and the goal is to recover any subset of the columns of $\Pi$ that spans the subspace containing $\xi$. We call such a representation $\mathbf{x}$ \textit{subspace-sparse}. We study conditions under which existing pursuit methods recover a subspace-sparse representation. Such conditions reveal important geometric insights and have implications for the theory of classical sparse recovery as well as subspace clustering.
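The following is a minimal numerical sketch (not from the paper) of the setting the abstract describes: a dictionary $\Pi$ containing several linearly dependent columns drawn from a low-dimensional subspace $S$, a signal $\xi$ lying in $S$, and one classical pursuit method, orthogonal matching pursuit, run to see whether every atom it selects comes from $S$. All variable names, dimensions, and the random construction are illustrative assumptions; whether recovery succeeds depends on the geometry of the dictionary, which is precisely what the paper characterizes.

# Illustrative sketch only: dictionary with linearly dependent in-subspace atoms,
# a signal in that subspace, and greedy OMP as an example pursuit method.
import numpy as np

rng = np.random.default_rng(0)

ambient_dim, subspace_dim = 6, 2
# Orthonormal basis U for a 2-dimensional subspace S of R^6.
U, _ = np.linalg.qr(rng.standard_normal((ambient_dim, subspace_dim)))

# Dictionary Pi: 5 linearly dependent unit-norm columns drawn from S,
# followed by 10 unit-norm columns drawn from the whole ambient space.
in_S = U @ rng.standard_normal((subspace_dim, 5))
out_S = rng.standard_normal((ambient_dim, 10))
Pi = np.hstack([in_S, out_S])
Pi /= np.linalg.norm(Pi, axis=0)

# Signal xi generated inside S (its sparse representation is not unique).
xi = U @ rng.standard_normal(subspace_dim)

def omp(Pi, xi, k):
    """Greedy orthogonal matching pursuit: select k atoms by residual correlation."""
    residual, support = xi.copy(), []
    for _ in range(k):
        support.append(int(np.argmax(np.abs(Pi.T @ residual))))
        coef, *_ = np.linalg.lstsq(Pi[:, support], xi, rcond=None)
        residual = xi - Pi[:, support] @ coef
    return support

support = omp(Pi, xi, subspace_dim)
# Subspace-sparse recovery succeeds if every selected atom lies in S
# (indices 0..4 in this construction); two generic atoms from S span it.
print("selected atoms:", support, "-> subspace-sparse:", all(j < 5 for j in support))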

Cite this Paper


BibTeX
@InProceedings{pmlr-v37-you15,
  title     = {Geometric Conditions for Subspace-Sparse Recovery},
  author    = {Chong You and Rene Vidal},
  booktitle = {Proceedings of the 32nd International Conference on Machine Learning},
  pages     = {1585--1593},
  year      = {2015},
  editor    = {Francis Bach and David Blei},
  volume    = {37},
  series    = {Proceedings of Machine Learning Research},
  address   = {Lille, France},
  month     = {07--09 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v37/you15.pdf},
  url       = {http://proceedings.mlr.press/v37/you15.html},
  abstract  = {Given a dictionary $\Pi$ and a signal $\xi = \Pi \mathbf{x}$ generated by a few \textit{linearly independent} columns of $\Pi$, classical sparse recovery theory deals with the problem of uniquely recovering the sparse representation $\mathbf{x}$ of $\xi$. In this work, we consider the more general case where $\xi$ lies in a low-dimensional subspace spanned by a few columns of $\Pi$, which are possibly \textit{linearly dependent}. In this case, $\mathbf{x}$ may not be unique, and the goal is to recover any subset of the columns of $\Pi$ that spans the subspace containing $\xi$. We call such a representation $\mathbf{x}$ \textit{subspace-sparse}. We study conditions under which existing pursuit methods recover a subspace-sparse representation. Such conditions reveal important geometric insights and have implications for the theory of classical sparse recovery as well as subspace clustering.}
}
APA
You, C. & Vidal, R. (2015). Geometric Conditions for Subspace-Sparse Recovery. Proceedings of the 32nd International Conference on Machine Learning, in PMLR 37:1585-1593.
