An Adaptive Accelerated Proximal Gradient Method and its Homotopy Continuation for Sparse Optimization

Qihang Lin, Lin Xiao
Proceedings of the 31st International Conference on Machine Learning, PMLR 32(1):73-81, 2014.

Abstract

We first propose an adaptive accelerated proximal gradient (APG) method for minimizing strongly convex composite functions with unknown convexity parameters. This method incorporates a restarting scheme to automatically estimate the strong convexity parameter and achieves a nearly optimal iteration complexity. Then we consider the ℓ1-regularized least-squares (ℓ1-LS) problem in the high-dimensional setting. Although such an objective function is not strongly convex, it has restricted strong convexity over sparse vectors. We exploit this property by combining the adaptive APG method with a homotopy continuation scheme, which generates a sparse solution path towards optimality. This method attains a global linear rate of convergence, and its overall iteration complexity has a weaker dependence on the restricted condition number than previous work.
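The abstract compresses two algorithmic ideas: an accelerated proximal gradient method whose momentum depends on an estimated strong convexity parameter μ, restarted with a smaller estimate whenever the observed progress contradicts that estimate; and a homotopy scheme that solves the ℓ1-LS problem min_x ½‖Ax − b‖² + λ‖x‖₁ for a geometrically decreasing sequence of λ, warm-starting each stage from the previous solution. The sketch below is a minimal NumPy illustration of this structure, not the paper's exact algorithm: the restart test (shrink μ when the objective fails to decrease), the initial estimate mu0, the shrink factor, and the continuation factor eta are simplified stand-ins for the conditions analyzed in the paper.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def objective(A, b, lam, x):
    """l1-LS objective: 0.5 * ||Ax - b||^2 + lam * ||x||_1."""
    r = A @ x - b
    return 0.5 * (r @ r) + lam * np.abs(x).sum()

def apg(A, b, lam, x0, L, mu, n_iter=100):
    """Accelerated proximal gradient with the constant momentum
    (1 - sqrt(q)) / (1 + sqrt(q)), q = mu / L, which is the standard
    choice when the smooth part is mu-strongly convex with an
    L-Lipschitz gradient."""
    q = mu / L
    beta = (1.0 - np.sqrt(q)) / (1.0 + np.sqrt(q))
    x, y = x0.copy(), x0.copy()
    for _ in range(n_iter):
        grad = A.T @ (A @ y - b)
        x_next = soft_threshold(y - grad / L, lam / L)
        y = x_next + beta * (x_next - x)
        x = x_next
    return x

def adaptive_apg(A, b, lam, x0, L, mu0, shrink=0.5, n_iter=100,
                 tol=1e-8, max_rounds=30):
    """Restart scheme (a simplified stand-in for the paper's test):
    run APG under the current estimate mu; if the objective did not
    decrease, the momentum was too aggressive, i.e. mu was
    overestimated, so shrink the estimate and restart from the same
    point."""
    x, mu = x0.copy(), mu0
    f = objective(A, b, lam, x)
    for _ in range(max_rounds):
        x_new = apg(A, b, lam, x, L, mu, n_iter)
        f_new = objective(A, b, lam, x_new)
        if f_new > f:                # no progress: mu estimate too large
            mu *= shrink
            continue
        if f - f_new <= tol * max(1.0, abs(f)):
            return x_new, mu         # converged at this stage
        x, f = x_new, f_new
    return x, mu

def homotopy_l1ls(A, b, lam_target, eta=0.7, L=None, mu0=None):
    """Homotopy continuation: start at lam_max = ||A^T b||_inf (where
    x = 0 is optimal), decrease lam geometrically, and warm-start each
    stage from the previous solution so the iterates stay sparse along
    the path -- the regime where restricted strong convexity applies."""
    if L is None:
        L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the gradient
    lam = np.max(np.abs(A.T @ b))
    x = np.zeros(A.shape[1])
    mu = mu0 if mu0 is not None else L / 10.0  # initial (over)estimate of mu
    while lam > lam_target:
        lam = max(eta * lam, lam_target)
        x, mu = adaptive_apg(A, b, lam, x, L, mu)
    return x
```

A quick synthetic run with a sparse ground truth shows the intended usage: the solver traces the warm-started path from λ_max down to the target regularization.

```python
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 1000))
x_true = np.zeros(1000)
x_true[:10] = 1.0
b = A @ x_true + 0.01 * rng.standard_normal(200)
lam_max = np.max(np.abs(A.T @ b))
x_hat = homotopy_l1ls(A, b, lam_target=0.01 * lam_max)
```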

Cite this Paper


BibTeX
@InProceedings{pmlr-v32-lin14,
  title     = {An Adaptive Accelerated Proximal Gradient Method and its Homotopy Continuation for Sparse Optimization},
  author    = {Lin, Qihang and Xiao, Lin},
  booktitle = {Proceedings of the 31st International Conference on Machine Learning},
  pages     = {73--81},
  year      = {2014},
  editor    = {Xing, Eric P. and Jebara, Tony},
  volume    = {32},
  number    = {1},
  series    = {Proceedings of Machine Learning Research},
  address   = {Beijing, China},
  month     = {22--24 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v32/lin14.pdf},
  url       = {https://proceedings.mlr.press/v32/lin14.html},
  abstract  = {We first propose an adaptive accelerated proximal gradient (APG) method for minimizing strongly convex composite functions with unknown convexity parameters. This method incorporates a restarting scheme to automatically estimate the strong convexity parameter and achieves a nearly optimal iteration complexity. Then we consider the ℓ1-regularized least-squares (ℓ1-LS) problem in the high-dimensional setting. Although such an objective function is not strongly convex, it has restricted strong convexity over sparse vectors. We exploit this property by combining the adaptive APG method with a homotopy continuation scheme, which generates a sparse solution path towards optimality. This method attains a global linear rate of convergence, and its overall iteration complexity has a weaker dependence on the restricted condition number than previous work.}
}
APA
Lin, Q. & Xiao, L. (2014). An Adaptive Accelerated Proximal Gradient Method and its Homotopy Continuation for Sparse Optimization. Proceedings of the 31st International Conference on Machine Learning, in Proceedings of Machine Learning Research 32(1):73-81. Available from https://proceedings.mlr.press/v32/lin14.html.
