Forward Basis Selection for Sparse Approximation over Dictionary

Xiaotong Yuan, Shuicheng Yan
Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics, PMLR 22:1377-1388, 2012.

Abstract

Recently, the forward greedy selection method has been successfully applied to approximately solve sparse learning problems, which are characterized by a trade-off between sparsity and accuracy. In this paper, we generalize this method to the setting of sparse approximation over a pre-fixed dictionary. A fully corrective forward selection algorithm is proposed, along with a convergence analysis. The per-iteration computational overhead of the proposed algorithm is dominated by two subproblems: a linear optimization over the dictionary and an optimal adjustment of the aggregation weights. In several applications, the former is cheaper than a Euclidean projection, while the latter is typically an unconstrained optimization problem that is relatively easy to solve. Furthermore, we extend the proposed algorithm to the setting of non-negative/convex sparse approximation over a dictionary. Applications of our algorithms to several concrete learning problems are explored, with efficiency validated on benchmark data sets.
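To make the two per-iteration subproblems concrete, below is a minimal NumPy sketch of a fully corrective forward selection loop. It assumes a squared-loss objective over a finite dictionary of column atoms; the function name, loss, and atom-selection rule are illustrative assumptions for exposition, not the paper's own pseudocode.

```python
import numpy as np

def forward_basis_selection(y, dictionary, n_iters=10):
    """Hedged sketch of fully corrective forward selection for
    min_x 0.5 * ||x - y||^2 with x in the span of the selected atoms.

    `dictionary` is a (d, m) array whose columns are the atoms; the
    squared loss and all names here are illustrative assumptions,
    not taken from the paper itself.
    """
    x = np.zeros_like(y, dtype=float)
    selected = []  # indices of atoms chosen so far
    w = np.array([])
    for _ in range(n_iters):
        grad = x - y  # gradient of 0.5 * ||x - y||^2
        # Linear-optimization subproblem over the dictionary:
        # pick the atom most correlated with the gradient direction.
        scores = np.abs(dictionary.T @ grad)
        j = int(np.argmax(scores))
        if j not in selected:
            selected.append(j)
        # Fully corrective subproblem: re-fit the weights of ALL
        # selected atoms by unconstrained least squares.
        B = dictionary[:, selected]  # (d, k) basis selected so far
        w, *_ = np.linalg.lstsq(B, y, rcond=None)
        x = B @ w
    return x, selected, w
```

The fully corrective re-fit of all selected weights is what distinguishes this scheme from plain matching pursuit or a standard Frank-Wolfe update, which adjust only the weight of the newly added atom.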

Cite this Paper


BibTeX
@InProceedings{pmlr-v22-yuan12, title = {Forward Basis Selection for Sparse Approximation over Dictionary}, author = {Yuan, Xiaotong and Yan, Shuicheng}, booktitle = {Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics}, pages = {1377--1388}, year = {2012}, editor = {Lawrence, Neil D. and Girolami, Mark}, volume = {22}, series = {Proceedings of Machine Learning Research}, address = {La Palma, Canary Islands}, month = {21--23 Apr}, publisher = {PMLR}, pdf = {http://proceedings.mlr.press/v22/yuan12/yuan12.pdf}, url = {https://proceedings.mlr.press/v22/yuan12.html}, abstract = {Recently, the forward greedy selection method has been successfully applied to approximately solve sparse learning problems, which are characterized by a trade-off between sparsity and accuracy. In this paper, we generalize this method to the setting of sparse approximation over a pre-fixed dictionary. A fully corrective forward selection algorithm is proposed, along with a convergence analysis. The per-iteration computational overhead of the proposed algorithm is dominated by two subproblems: a linear optimization over the dictionary and an optimal adjustment of the aggregation weights. In several applications, the former is cheaper than a Euclidean projection, while the latter is typically an unconstrained optimization problem that is relatively easy to solve. Furthermore, we extend the proposed algorithm to the setting of non-negative/convex sparse approximation over a dictionary. Applications of our algorithms to several concrete learning problems are explored, with efficiency validated on benchmark data sets.} }
Endnote
%0 Conference Paper %T Forward Basis Selection for Sparse Approximation over Dictionary %A Xiaotong Yuan %A Shuicheng Yan %B Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics %C Proceedings of Machine Learning Research %D 2012 %E Neil D. Lawrence %E Mark Girolami %F pmlr-v22-yuan12 %I PMLR %P 1377--1388 %U https://proceedings.mlr.press/v22/yuan12.html %V 22 %X Recently, the forward greedy selection method has been successfully applied to approximately solve sparse learning problems, which are characterized by a trade-off between sparsity and accuracy. In this paper, we generalize this method to the setting of sparse approximation over a pre-fixed dictionary. A fully corrective forward selection algorithm is proposed, along with a convergence analysis. The per-iteration computational overhead of the proposed algorithm is dominated by two subproblems: a linear optimization over the dictionary and an optimal adjustment of the aggregation weights. In several applications, the former is cheaper than a Euclidean projection, while the latter is typically an unconstrained optimization problem that is relatively easy to solve. Furthermore, we extend the proposed algorithm to the setting of non-negative/convex sparse approximation over a dictionary. Applications of our algorithms to several concrete learning problems are explored, with efficiency validated on benchmark data sets.
RIS
TY - CPAPER TI - Forward Basis Selection for Sparse Approximation over Dictionary AU - Xiaotong Yuan AU - Shuicheng Yan BT - Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics DA - 2012/04/21 ED - Neil D. Lawrence ED - Mark Girolami ID - pmlr-v22-yuan12 PB - PMLR DP - Proceedings of Machine Learning Research VL - 22 SP - 1377 EP - 1388 L1 - http://proceedings.mlr.press/v22/yuan12/yuan12.pdf UR - https://proceedings.mlr.press/v22/yuan12.html AB - Recently, the forward greedy selection method has been successfully applied to approximately solve sparse learning problems, which are characterized by a trade-off between sparsity and accuracy. In this paper, we generalize this method to the setting of sparse approximation over a pre-fixed dictionary. A fully corrective forward selection algorithm is proposed, along with a convergence analysis. The per-iteration computational overhead of the proposed algorithm is dominated by two subproblems: a linear optimization over the dictionary and an optimal adjustment of the aggregation weights. In several applications, the former is cheaper than a Euclidean projection, while the latter is typically an unconstrained optimization problem that is relatively easy to solve. Furthermore, we extend the proposed algorithm to the setting of non-negative/convex sparse approximation over a dictionary. Applications of our algorithms to several concrete learning problems are explored, with efficiency validated on benchmark data sets. ER -
APA
Yuan, X. & Yan, S. (2012). Forward Basis Selection for Sparse Approximation over Dictionary. Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 22:1377-1388. Available from https://proceedings.mlr.press/v22/yuan12.html.

Related Material

Download PDF: http://proceedings.mlr.press/v22/yuan12/yuan12.pdf