Accelerated and Sparse Algorithms for Approximate Personalized PageRank and Beyond
Proceedings of Thirty Sixth Conference on Learning Theory, PMLR 195:2852-2876, 2023.
Abstract
It has recently been shown that ISTA, an unaccelerated optimization method, performs sparse updates for the $\ell_1$-regularized undirected personalized PageRank problem (Fountoulakis et al., 2019), leading to cheap per-iteration cost and providing the same guarantees as the approximate personalized PageRank algorithm (APPR) of Andersen et al. (2006). In this work, we design an accelerated optimization algorithm for this problem that also performs sparse updates, providing an affirmative answer to the COLT 2022 open question of Fountoulakis and Yang (2022). Acceleration provides a reduced dependence on the condition number, while the dependence on the sparsity of our updates differs from that of the ISTA approach. Further, we design another algorithm that uses conjugate directions to compute an exact solution while exploiting sparsity. Both algorithms lead to faster convergence in certain parameter regimes. Our findings apply beyond PageRank and hold for any quadratic objective whose Hessian is a positive-definite $M$-matrix.
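To make the setup concrete, below is a minimal sketch of ISTA (proximal gradient with soft-thresholding) applied to the generic problem the abstract refers to, $\min_x \tfrac{1}{2}x^\top Q x - b^\top x + \rho\|x\|_1$ with $Q$ a symmetric positive-definite $M$-matrix. The function name, the dense matrix-vector products, and the step-size choice are illustrative assumptions and not the paper's implementation: the algorithms discussed here exploit the $M$-matrix structure to touch only the few coordinates in the (small) support of the iterates, which is what makes each update cheap.

```python
import numpy as np

def ista_m_matrix(Q, b, rho, iters=1000, tol=1e-10):
    # Hypothetical helper (not the paper's code): ISTA for
    #   min_x 0.5 * x^T Q x - b^T x + rho * ||x||_1,
    # where Q is a symmetric positive-definite M-matrix.
    step = 1.0 / np.linalg.eigvalsh(Q)[-1]   # 1/L with L the largest eigenvalue of Q
    x = np.zeros_like(b)
    for _ in range(iters):
        grad = Q @ x - b                      # gradient of the smooth quadratic part
        z = x - step * grad                   # forward (gradient) step
        # Backward step: proximal operator of rho*||.||_1, i.e. soft-thresholding.
        x_new = np.sign(z) * np.maximum(np.abs(z) - step * rho, 0.0)
        if np.linalg.norm(x_new - x) <= tol:  # stop once the iterates stabilize
            break
        x = x_new
    return x_new

# Tiny example: a tridiagonal positive-definite M-matrix
# (nonpositive off-diagonals, positive eigenvalues).
Q = np.array([[ 2.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  2.0]])
b = np.array([1.0, 0.0, 0.0])
print(ista_m_matrix(Q, b, rho=0.1))
```

In the personalized PageRank instance, the dense products above would be replaced by updates restricted to coordinates whose gradient is sufficiently negative, following Fountoulakis et al. (2019); this dense version is for exposition only.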