Accelerated and Sparse Algorithms for Approximate Personalized PageRank and Beyond

David Martínez-Rubio, Elias Wirth, Sebastian Pokutta
Proceedings of Thirty Sixth Conference on Learning Theory, PMLR 195:2852-2876, 2023.

Abstract

It has recently been shown that ISTA, an unaccelerated optimization method, performs sparse updates for the $\ell_1$-regularized undirected personalized PageRank problem (Fountoulakis et al., 2019), leading to a cheap per-iteration cost and providing the same guarantees as the approximate personalized PageRank algorithm (APPR) (Andersen et al., 2006). In this work, we design an accelerated optimization algorithm for this problem that also performs sparse updates, providing an affirmative answer to the COLT 2022 open question of Fountoulakis et al. (2022). Acceleration provides a reduced dependence on the condition number, while the dependence on the sparsity in our updates differs from the ISTA approach. Further, we design another algorithm by using conjugate directions to achieve an exact solution while exploiting sparsity. Both algorithms lead to faster convergence for certain parameter regimes. Our findings apply beyond PageRank and work for any quadratic objective whose Hessian is a positive-definite $M$-matrix.
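For context, ISTA here is proximal gradient descent with soft-thresholding applied to a quadratic-plus-$\ell_1$ objective. Below is a minimal dense sketch of that scheme, assuming a generic symmetric positive-definite matrix Q, linear term b, regularization rho, and step size; these names are illustrative placeholders, not the paper's exact personalized-PageRank parametrization, and the sketch deliberately omits the sparse, local updates that make the graph algorithms cheap.

    import numpy as np

    def soft_threshold(v, t):
        # Proximal operator of t * ||.||_1: elementwise soft-thresholding.
        return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

    def ista(Q, b, rho, step, iters):
        # Minimize f(x) = 0.5 * x^T Q x - b^T x + rho * ||x||_1 by
        # proximal gradient (ISTA). Q is assumed symmetric positive
        # definite (e.g., an M-matrix, as in the paper's setting).
        x = np.zeros_like(b)
        for _ in range(iters):
            grad = Q @ x - b  # gradient of the smooth quadratic part
            x = soft_threshold(x - step * grad, step * rho)
        return x

With step = 1/L, where L is the largest eigenvalue of Q, this scheme converges linearly at a rate governed by the condition number L/mu for a mu-strongly convex quadratic; the accelerated variant studied in the paper improves that dependence to roughly sqrt(L/mu) while still keeping the updates sparse.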

Cite this Paper


BibTeX
@InProceedings{pmlr-v195-martinez-rubio23b,
  title     = {Accelerated and Sparse Algorithms for Approximate Personalized PageRank and Beyond},
  author    = {Mart{\'i}nez-Rubio, David and Wirth, Elias and Pokutta, Sebastian},
  booktitle = {Proceedings of Thirty Sixth Conference on Learning Theory},
  pages     = {2852--2876},
  year      = {2023},
  editor    = {Neu, Gergely and Rosasco, Lorenzo},
  volume    = {195},
  series    = {Proceedings of Machine Learning Research},
  month     = {12--15 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v195/martinez-rubio23b/martinez-rubio23b.pdf},
  url       = {https://proceedings.mlr.press/v195/martinez-rubio23b.html},
  abstract  = {It has recently been shown that ISTA, an unaccelerated optimization method, performs sparse updates for the $\ell_1$-regularized undirected personalized PageRank problem (Fountoulakis et al., 2019), leading to a cheap per-iteration cost and providing the same guarantees as the approximate personalized PageRank algorithm (APPR) (Andersen et al., 2006). In this work, we design an accelerated optimization algorithm for this problem that also performs sparse updates, providing an affirmative answer to the COLT 2022 open question of Fountoulakis et al. (2022). Acceleration provides a reduced dependence on the condition number, while the dependence on the sparsity in our updates differs from the ISTA approach. Further, we design another algorithm by using conjugate directions to achieve an exact solution while exploiting sparsity. Both algorithms lead to faster convergence for certain parameter regimes. Our findings apply beyond PageRank and work for any quadratic objective whose Hessian is a positive-definite $M$-matrix.}
}
Endnote
%0 Conference Paper
%T Accelerated and Sparse Algorithms for Approximate Personalized PageRank and Beyond
%A David Martínez-Rubio
%A Elias Wirth
%A Sebastian Pokutta
%B Proceedings of Thirty Sixth Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2023
%E Gergely Neu
%E Lorenzo Rosasco
%F pmlr-v195-martinez-rubio23b
%I PMLR
%P 2852--2876
%U https://proceedings.mlr.press/v195/martinez-rubio23b.html
%V 195
%X It has recently been shown that ISTA, an unaccelerated optimization method, performs sparse updates for the $\ell_1$-regularized undirected personalized PageRank problem (Fountoulakis et al., 2019), leading to a cheap per-iteration cost and providing the same guarantees as the approximate personalized PageRank algorithm (APPR) (Andersen et al., 2006). In this work, we design an accelerated optimization algorithm for this problem that also performs sparse updates, providing an affirmative answer to the COLT 2022 open question of Fountoulakis et al. (2022). Acceleration provides a reduced dependence on the condition number, while the dependence on the sparsity in our updates differs from the ISTA approach. Further, we design another algorithm by using conjugate directions to achieve an exact solution while exploiting sparsity. Both algorithms lead to faster convergence for certain parameter regimes. Our findings apply beyond PageRank and work for any quadratic objective whose Hessian is a positive-definite $M$-matrix.
APA
Martínez-Rubio, D., Wirth, E. & Pokutta, S. (2023). Accelerated and Sparse Algorithms for Approximate Personalized PageRank and Beyond. Proceedings of Thirty Sixth Conference on Learning Theory, in Proceedings of Machine Learning Research 195:2852-2876. Available from https://proceedings.mlr.press/v195/martinez-rubio23b.html.