Fast Algorithms for Sparse Reduced-Rank Regression

Benjamin Dubois, Jean-François Delmas, Guillaume Obozinski
Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics, PMLR 89:2415-2424, 2019.

Abstract

We consider a reformulation of Reduced-Rank Regression (RRR) and Sparse Reduced-Rank Regression (SRRR) as a non-convex, non-differentiable function of only one of the two matrices usually introduced to parametrize low-rank matrix learning problems. We study the behavior of proximal gradient algorithms for the minimization of this objective. In particular, based on an analysis of the geometry of the problem, we establish that a proximal Polyak-Łojasiewicz inequality is satisfied in a neighborhood of the set of optima under a condition on the regularization parameter. We consequently derive linear convergence rates for proximal gradient descent with line search, and for related algorithms, in a neighborhood of the optima. Our experiments show that our formulation leads to much faster learning algorithms for RRR and especially for SRRR.
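
To make the proximal-gradient machinery mentioned in the abstract concrete, below is a minimal, self-contained sketch of proximal gradient descent with backtracking line search on a row-sparse (group-lasso) multivariate regression objective. This is not the paper's reformulated RRR/SRRR objective: the smooth term here is a plain least-squares loss used only as a stand-in, and the function names (prox_grad_line_search, row_soft_threshold), the regularization level, and the synthetic dimensions are illustrative assumptions.

import numpy as np

def row_soft_threshold(U, tau):
    """Proximal operator of U -> tau * sum_i ||U[i, :]||_2 (block soft-thresholding)."""
    norms = np.linalg.norm(U, axis=1, keepdims=True)
    scale = np.maximum(0.0, 1.0 - tau / np.maximum(norms, 1e-12))
    return scale * U

def prox_grad_line_search(X, Y, lam, n_iter=200, step0=1.0, beta=0.5):
    """Proximal gradient with backtracking line search on
    F(U) = 0.5 * ||Y - X U||_F^2 + lam * sum_i ||U[i, :]||_2
    (a stand-in smooth loss, not the paper's reduced-rank reformulation)."""
    p, q = X.shape[1], Y.shape[1]
    U = np.zeros((p, q))

    def f(U):
        return 0.5 * np.linalg.norm(Y - X @ U) ** 2

    def grad_f(U):
        return X.T @ (X @ U - Y)

    for _ in range(n_iter):
        g, fU, step = grad_f(U), f(U), step0
        # Backtracking: shrink the step until the standard sufficient-decrease
        # condition for a proximal gradient step holds.
        while True:
            U_new = row_soft_threshold(U - step * g, step * lam)
            D = U_new - U
            if f(U_new) <= fU + np.sum(g * D) + np.sum(D ** 2) / (2 * step):
                break
            step *= beta
        U = U_new
    return U

# Illustrative run on synthetic data (dimensions are arbitrary).
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 30))
U_true = np.zeros((30, 12))
U_true[:8] = rng.standard_normal((8, 12))
Y = X @ U_true + 0.1 * rng.standard_normal((100, 12))
U_hat = prox_grad_line_search(X, Y, lam=5.0, n_iter=100)
print("non-zero rows:", int(np.sum(np.linalg.norm(U_hat, axis=1) > 1e-8)))

The row-wise soft-thresholding step is what produces exact zeros in entire rows of the estimate, which is the sparsity pattern targeted by SRRR; the line search plays the role of the step-size selection analyzed in the paper's convergence results.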

Cite this Paper


BibTeX
@InProceedings{pmlr-v89-dubois19a,
  title     = {Fast Algorithms for Sparse Reduced-Rank Regression},
  author    = {Dubois, Benjamin and Delmas, Jean-Fran\c{c}ois and Obozinski, Guillaume},
  booktitle = {Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics},
  pages     = {2415--2424},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Sugiyama, Masashi},
  volume    = {89},
  series    = {Proceedings of Machine Learning Research},
  month     = {16--18 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v89/dubois19a/dubois19a.pdf},
  url       = {https://proceedings.mlr.press/v89/dubois19a.html},
  abstract  = {We consider a reformulation of Reduced-Rank Regression (RRR) and Sparse Reduced-Rank Regression (SRRR) as a non-convex non-differentiable function of a single of the two matrices usually introduced to parametrize low-rank matrix learning problems. We study the behavior of proximal gradient algorithms for the minimization of the objective. In particular, based on an analysis of the geometry of the problem, we establish that a proximal Polyak-{Ł}ojasiewicz inequality is satisfied in a neighborhood of the set of optima under a condition on the regularization parameter. We can consequently derive linear convergence rates for the proximal gradient descent with line search and for related algorithms in a neighborhood of the optima. Our experiments show that our formulation leads to much faster learning algorithms for RRR and especially for SRRR.}
}
Endnote
%0 Conference Paper
%T Fast Algorithms for Sparse Reduced-Rank Regression
%A Benjamin Dubois
%A Jean-François Delmas
%A Guillaume Obozinski
%B Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Masashi Sugiyama
%F pmlr-v89-dubois19a
%I PMLR
%P 2415--2424
%U https://proceedings.mlr.press/v89/dubois19a.html
%V 89
%X We consider a reformulation of Reduced-Rank Regression (RRR) and Sparse Reduced-Rank Regression (SRRR) as a non-convex non-differentiable function of a single of the two matrices usually introduced to parametrize low-rank matrix learning problems. We study the behavior of proximal gradient algorithms for the minimization of the objective. In particular, based on an analysis of the geometry of the problem, we establish that a proximal Polyak-Łojasiewicz inequality is satisfied in a neighborhood of the set of optima under a condition on the regularization parameter. We can consequently derive linear convergence rates for the proximal gradient descent with line search and for related algorithms in a neighborhood of the optima. Our experiments show that our formulation leads to much faster learning algorithms for RRR and especially for SRRR.
APA
Dubois, B., Delmas, J.-F., & Obozinski, G. (2019). Fast Algorithms for Sparse Reduced-Rank Regression. Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 89:2415-2424. Available from https://proceedings.mlr.press/v89/dubois19a.html.