Asymptotic Properties of Nonparametric Estimation on Manifold

Yury Yanovich
Proceedings of the Sixth Workshop on Conformal and Probabilistic Prediction and Applications, PMLR 60:18-38, 2017.

Abstract

In many applications, high-dimensional data occupy only a very small part of the high-dimensional ‘observation space’, and their intrinsic dimension is small. The most popular model for such data is the Manifold model, which assumes that the data lie on or near an unknown manifold (the Data Manifold, DM) of lower dimensionality embedded in an ambient high-dimensional input space (the Manifold Assumption about high-dimensional data). Manifold Learning is the Dimensionality Reduction problem under the Manifold Assumption about the processed data, and its goal is to construct a low-dimensional parameterization of the DM (global low-dimensional coordinates on the DM) from a finite dataset sampled from the DM. The Manifold Assumption means that a local neighborhood of each manifold point is equivalent to an area of low-dimensional Euclidean space. Because of this, most Manifold Learning algorithms consist of two parts: a ‘local part’, in which certain characteristics reflecting the low-dimensional local structure of the neighborhoods of all sample points are constructed via nonparametric estimation, and a ‘global part’, in which global low-dimensional coordinates on the DM are constructed by solving a certain convex optimization problem for a specific cost function depending on the local characteristics. The paper considers both the statistical properties of the ‘local part’ and its average over the manifold. The article extends the paper (Yanovich, 2016) to the case of nonparametric estimation.
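As a hedged illustration of the ‘local part’ described above, the following minimal Python sketch estimates the tangent space of the Data Manifold at each sample point by kernel-weighted local PCA, one common nonparametric local characteristic. The function name local_pca_tangent, the Epanechnikov kernel, the bandwidth h=0.3, and the toy circle data are illustrative assumptions, not taken from the paper.

    # Minimal sketch (illustrative assumptions, not the paper's method):
    # nonparametric estimation of the local tangent space at each sample
    # point via kernel-weighted PCA over a neighborhood of radius h.
    import numpy as np

    def local_pca_tangent(X, i, h, d):
        """Estimate a d-dimensional tangent basis at sample point X[i]
        from neighbors within bandwidth h (Epanechnikov weights)."""
        dist2 = np.sum((X - X[i])**2, axis=1) / h**2
        w = np.maximum(1.0 - dist2, 0.0)              # zero outside the h-ball
        mu = (w[:, None] * X).sum(axis=0) / w.sum()   # local weighted mean
        C = ((w[:, None] * (X - mu)).T @ (X - mu)) / w.sum()  # weighted covariance
        _, eigvecs = np.linalg.eigh(C)                # eigenvalues in ascending order
        return eigvecs[:, -d:]                        # top-d directions span the tangent

    # Toy data: 500 points sampled near a circle (intrinsic dimension d = 1) in R^3.
    rng = np.random.default_rng(0)
    t = rng.uniform(0.0, 2.0 * np.pi, 500)
    X = np.c_[np.cos(t), np.sin(t), 0.01 * rng.standard_normal(500)]

    # The 'local part': one nonparametric local characteristic per sample point.
    bases = [local_pca_tangent(X, i, h=0.3, d=1) for i in range(len(X))]

Averaging such local characteristics over all sample points, as in the last line, corresponds to the averages over the manifold whose asymptotic properties the paper studies.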

Cite this Paper


BibTeX
@InProceedings{pmlr-v60-yanovich17a,
  title     = {Asymptotic Properties of Nonparametric Estimation on Manifold},
  author    = {Yanovich, Yury},
  booktitle = {Proceedings of the Sixth Workshop on Conformal and Probabilistic Prediction and Applications},
  pages     = {18--38},
  year      = {2017},
  editor    = {Gammerman, Alex and Vovk, Vladimir and Luo, Zhiyuan and Papadopoulos, Harris},
  volume    = {60},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--16 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v60/yanovich17a/yanovich17a.pdf},
  url       = {https://proceedings.mlr.press/v60/yanovich17a.html}
}
Endnote
%0 Conference Paper
%T Asymptotic Properties of Nonparametric Estimation on Manifold
%A Yury Yanovich
%B Proceedings of the Sixth Workshop on Conformal and Probabilistic Prediction and Applications
%C Proceedings of Machine Learning Research
%D 2017
%E Alex Gammerman
%E Vladimir Vovk
%E Zhiyuan Luo
%E Harris Papadopoulos
%F pmlr-v60-yanovich17a
%I PMLR
%P 18--38
%U https://proceedings.mlr.press/v60/yanovich17a.html
%V 60
APA
Yanovich, Y. (2017). Asymptotic Properties of Nonparametric Estimation on Manifold. Proceedings of the Sixth Workshop on Conformal and Probabilistic Prediction and Applications, in Proceedings of Machine Learning Research 60:18-38. Available from https://proceedings.mlr.press/v60/yanovich17a.html.
