Statistical Sparse Online Regression: A Diffusion Approximation Perspective

Jianqing Fan, Wenyan Gong, Chris Junchi Li, Qiang Sun;
Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics, PMLR 84:1017-1026, 2018.

Abstract

In this paper, we adopt diffusion approximation techniques to study online regression. These techniques allow us to characterize the exact dynamics of the online regression process. As a consequence, we obtain the optimal statistical rate of convergence up to a logarithmic factor of the streaming sample size. Using trajectory averaging, we further improve the rate of convergence by eliminating the logarithmic factor. Lastly, we propose a two-step algorithm for sparse online regression: a burn-in step using offline learning, followed by a refinement step using a variant of truncated stochastic gradient descent. Under appropriate assumptions, we show that the proposed algorithm produces near-optimal sparse estimators. Numerical experiments lend further support to the theory.
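The refinement step described above can be illustrated with a minimal sketch of truncated stochastic gradient descent for sparse linear regression. This is not the authors' exact procedure: the function name, the decaying step size `eta / sqrt(t)`, and the hard-truncation threshold are illustrative assumptions, and the burn-in initializer is simply passed in as a starting iterate.

```python
import numpy as np

def truncated_sgd(stream, beta_init, eta=0.1, threshold=0.05):
    """Sketch of a truncated-SGD refinement step for sparse online regression.

    stream:    iterable of (x, y) observations arriving one at a time
    beta_init: initial estimate, e.g. from an offline burn-in fit
    eta:       base step size, decayed as eta / sqrt(t)  (illustrative choice)
    threshold: coefficients below this magnitude are zeroed after each step
               (hypothetical value; not from the paper)
    """
    beta = np.array(beta_init, dtype=float)
    for t, (x, y) in enumerate(stream, start=1):
        grad = (x @ beta - y) * x              # squared-loss stochastic gradient
        beta -= (eta / np.sqrt(t)) * grad      # decaying-step SGD update
        beta[np.abs(beta) < threshold] = 0.0   # hard truncation for sparsity
    return beta
```

A typical usage would simulate a stream from a sparse linear model, warm-start `beta_init` with an offline estimate on a small initial batch, and then feed the remaining observations through `truncated_sgd` one at a time.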