Stochastic Approximation for Online Tensorial Independent Component Analysis

Chris Junchi Li, Michael Jordan
Proceedings of Thirty Fourth Conference on Learning Theory, PMLR 134:3051-3106, 2021.

Abstract

Independent component analysis (ICA) has been a popular dimension reduction tool in statistical machine learning and signal processing. In this paper, we present a convergence analysis for an online tensorial ICA algorithm, by viewing the problem as a nonconvex stochastic approximation problem. For estimating one component, we provide a dynamics-based analysis to prove that our online tensorial ICA algorithm with a specific choice of stepsize achieves a sharp finite-sample error bound. In particular, under a mild assumption on the data-generating distribution and a scaling condition such that $d^4/T$ is sufficiently small up to a polylogarithmic factor of data dimension $d$ and sample size $T$, a sharp finite-sample error bound of $\tilde{O}(\sqrt{d/T})$ can be obtained.
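
To make the abstract's setting concrete, the sketch below shows a generic one-component online ICA update of the stochastic-approximation type it describes: projected stochastic gradient ascent for the fourth-order (kurtosis) contrast $\mathbb{E}[(u^\top x)^4]$ over the unit sphere, run on whitened data with a constant stepsize of order $\sqrt{d/T}$, which matches the error rate quoted above. This is an illustrative sketch, not necessarily the paper's exact algorithm; the function name, the Laplace source model, and the stepsize constant are assumptions made for the example.

```python
import numpy as np

def online_ica_one_component(X, eta, seed=0):
    """Sketch of a one-component online ICA update via projected stochastic
    approximation (illustrative; not necessarily the paper's exact algorithm).

    X   : array of shape (T, d) of whitened observations, processed as a stream
    eta : constant stepsize; a value of order sqrt(d/T), up to logarithmic
          factors, matches the error rate quoted in the abstract (assumption)
    """
    T, d = X.shape
    rng = np.random.default_rng(seed)
    u = rng.standard_normal(d)
    u /= np.linalg.norm(u)              # uniform random start on the unit sphere

    for x in X:                         # single pass over the data stream
        grad = (u @ x) ** 3 * x         # stochastic gradient of E[(u^T x)^4] / 4
        u = u + eta * grad              # stochastic-approximation step
        u /= np.linalg.norm(u)          # project back onto the unit sphere
    return u


if __name__ == "__main__":
    # Toy usage: whitened data from positive-kurtosis (Laplace) sources mixed
    # by an orthogonal matrix Q; the estimate should align with a column of Q.
    T, d = 50000, 10
    rng = np.random.default_rng(1)
    S = rng.laplace(scale=1 / np.sqrt(2), size=(T, d))   # unit-variance sources
    Q, _ = np.linalg.qr(rng.standard_normal((d, d)))     # orthogonal mixing
    X = S @ Q.T
    u_hat = online_ica_one_component(X, eta=np.sqrt(d / T))
    print(np.max(np.abs(Q.T @ u_hat)))   # near 1 when a component is recovered
```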

Cite this Paper


BibTeX
@InProceedings{pmlr-v134-li21a,
  title     = {Stochastic Approximation for Online Tensorial Independent Component Analysis},
  author    = {Li, Chris Junchi and Jordan, Michael},
  booktitle = {Proceedings of Thirty Fourth Conference on Learning Theory},
  pages     = {3051--3106},
  year      = {2021},
  editor    = {Belkin, Mikhail and Kpotufe, Samory},
  volume    = {134},
  series    = {Proceedings of Machine Learning Research},
  month     = {15--19 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v134/li21a/li21a.pdf},
  url       = {https://proceedings.mlr.press/v134/li21a.html},
  abstract  = {Independent component analysis (ICA) has been a popular dimension reduction tool in statistical machine learning and signal processing. In this paper, we present a convergence analysis for an online tensorial ICA algorithm, by viewing the problem as a nonconvex stochastic approximation problem. For estimating one component, we provide a dynamics-based analysis to prove that our online tensorial ICA algorithm with a specific choice of stepsize achieves a sharp finite-sample error bound. In particular, under a mild assumption on the data-generating distribution and a scaling condition such that $d^4/T$ is sufficiently small up to a polylogarithmic factor of data dimension $d$ and sample size $T$, a sharp finite-sample error bound of $\tilde{O}(\sqrt{d/T})$ can be obtained.}
}
Endnote
%0 Conference Paper
%T Stochastic Approximation for Online Tensorial Independent Component Analysis
%A Chris Junchi Li
%A Michael Jordan
%B Proceedings of Thirty Fourth Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2021
%E Mikhail Belkin
%E Samory Kpotufe
%F pmlr-v134-li21a
%I PMLR
%P 3051--3106
%U https://proceedings.mlr.press/v134/li21a.html
%V 134
%X Independent component analysis (ICA) has been a popular dimension reduction tool in statistical machine learning and signal processing. In this paper, we present a convergence analysis for an online tensorial ICA algorithm, by viewing the problem as a nonconvex stochastic approximation problem. For estimating one component, we provide a dynamics-based analysis to prove that our online tensorial ICA algorithm with a specific choice of stepsize achieves a sharp finite-sample error bound. In particular, under a mild assumption on the data-generating distribution and a scaling condition such that $d^4/T$ is sufficiently small up to a polylogarithmic factor of data dimension $d$ and sample size $T$, a sharp finite-sample error bound of $\tilde{O}(\sqrt{d/T})$ can be obtained.
APA
Li, C.J. & Jordan, M. (2021). Stochastic Approximation for Online Tensorial Independent Component Analysis. Proceedings of Thirty Fourth Conference on Learning Theory, in Proceedings of Machine Learning Research 134:3051-3106. Available from https://proceedings.mlr.press/v134/li21a.html.
