IHT dies hard: Provable accelerated Iterative Hard Thresholding
Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics, PMLR 84:188-198, 2018.
Abstract
We study, both in theory and in practice, the use of momentum in classic iterative hard thresholding (IHT) methods. By simply modifying plain IHT, we investigate its convergence behavior on convex optimization criteria with non-convex constraints, under standard assumptions. In diverse scenarios, we observe that acceleration in IHT leads to significant improvements compared to state-of-the-art projected gradient descent and Frank-Wolfe variants. As a byproduct of our inspection, we study the impact of selecting the momentum parameter: similar to convex settings, two modes of behavior are observed, "rippling" and linear, depending on the level of momentum.
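To make the scheme concrete, here is a minimal sketch of momentum-accelerated IHT for the canonical sparse least-squares problem min ||y - Ax||^2 subject to ||x||_0 <= k, following the standard extrapolate-then-threshold template. The objective, the step size mu, the momentum parameter tau, and all function names are illustrative assumptions, not details taken from the paper:

```python
import numpy as np

def hard_threshold(x, k):
    """Keep the k largest-magnitude entries of x; zero out the rest."""
    z = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-k:]
    z[idx] = x[idx]
    return z

def accelerated_iht(A, y, k, mu=None, tau=0.9, iters=100):
    """Illustrative momentum-accelerated IHT for min ||y - Ax||^2, ||x||_0 <= k.

    tau is the momentum level; tau = 0 recovers plain IHT.
    (Sketch under assumed settings, not the paper's exact algorithm.)
    """
    n = A.shape[1]
    if mu is None:
        # A standard safe step size for least squares: 1 / ||A||_2^2.
        mu = 1.0 / np.linalg.norm(A, 2) ** 2
    x_prev = np.zeros(n)
    x = np.zeros(n)
    for _ in range(iters):
        v = x + tau * (x - x_prev)        # momentum extrapolation step
        g = A.T @ (A @ v - y)             # gradient of the least-squares loss
        x_prev, x = x, hard_threshold(v - mu * g, k)  # gradient step + projection
    return x
```

Setting tau = 0 recovers plain IHT; per the abstract, varying the momentum level produces either a "rippling" or a linear convergence pattern.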