Low-Rank Riemannian Optimization on Positive Semidefinite Stochastic Matrices with Applications to Graph Clustering
Proceedings of the 35th International Conference on Machine Learning, PMLR 80:1299-1308, 2018.
Abstract
This paper develops a Riemannian optimization framework for solving optimization problems on the set of symmetric positive semidefinite stochastic matrices. The paper first reformulates the problem by factorizing the optimization variable as $\mathbf{X}=\mathbf{Y}\mathbf{Y}^T$ and deriving conditions on $p$, i.e., the number of columns of $\mathbf{Y}$, under which the factorization yields a satisfactory solution. This reparameterization allows the problem to be formulated as an optimization over either an embedded or a quotient Riemannian manifold, whose geometries are investigated. In particular, the paper explicitly derives the tangent space, Riemannian gradients, and retraction operator that allow the design of efficient optimization methods on the proposed manifolds. The numerical results reveal that, when the optimal solution has a known low rank, the resulting algorithms present a clear complexity advantage when compared with state-of-the-art Euclidean and Riemannian approaches for graph clustering applications.
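As an illustration of why the factorized reparameterization $\mathbf{X}=\mathbf{Y}\mathbf{Y}^T$ can reduce cost, the following sketch evaluates a Frobenius-norm objective $f(\mathbf{X}) = \|\mathbf{X}-\mathbf{A}\|_F^2$ and its Euclidean gradient with respect to $\mathbf{Y}$ without ever forming the $n \times n$ matrix $\mathbf{X}$. This is a minimal sketch with hypothetical data (`A`, `Y`, and the function names are illustrative assumptions), not the paper's algorithm; the paper's methods additionally enforce the stochastic-matrix constraints through the manifold geometry.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 6, 2  # p = number of columns of Y (p << n in the low-rank regime)

# Hypothetical symmetric affinity matrix A (illustrative data only)
A = rng.random((n, n))
A = (A + A.T) / 2

# Low-rank factor Y, so that X = Y @ Y.T is symmetric PSD of rank <= p
Y = rng.random((n, p))

def objective_factored(Y, A):
    """f(Y Y^T) = ||Y Y^T - A||_F^2, computed without forming X = Y Y^T.

    Uses the expansion ||Y Y^T||_F^2 - 2 tr(Y^T A Y) + ||A||_F^2, where
    ||Y Y^T||_F^2 = ||Y^T Y||_F^2 needs only the small p x p Gram matrix.
    """
    G = Y.T @ Y  # p x p Gram matrix
    return (G * G).sum() - 2.0 * np.trace(Y.T @ (A @ Y)) + (A * A).sum()

def egrad_factored(Y, A):
    """Euclidean gradient of f w.r.t. Y: 4 (Y Y^T - A) Y (A symmetric),
    again evaluated rank-aware as 4 (Y (Y^T Y) - A Y)."""
    return 4.0 * (Y @ (Y.T @ Y) - A @ Y)

# Sanity check against the dense formulation that materializes X
X = Y @ Y.T
assert np.isclose(objective_factored(Y, A), ((X - A) ** 2).sum())
assert np.allclose(egrad_factored(Y, A), 4.0 * (X - A) @ Y)
```

In a Riemannian scheme, a Euclidean gradient such as this one would then be projected onto the tangent space of the chosen manifold and the iterate mapped back via the retraction; the saving comes from storing and updating the $n \times p$ factor rather than the full $n \times n$ variable.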