Principal Variety Analysis

Reza Iraji, Hamidreza Chitsaz
Proceedings of the 1st Annual Conference on Robot Learning, PMLR 78:97-108, 2017.

Abstract

We introduce a novel computational framework, Principal Variety Analysis (PVA), for primarily nonlinear data modeling. PVA admits algebraic sets as the target subspace, thereby addressing limitations of other existing approaches. The power of PVA is demonstrated in this paper on learning the kinematics of objects, an important application. PVA takes as input the recorded coordinates of pre-specified features on the objects and outputs a lowest-dimensional variety on which the feature coordinates jointly lie. Unlike existing object modeling methods, which require entire trajectories of objects, PVA requires far less information and provides a more flexible and generalizable model, namely an analytical algebraic kinematic model of the objects, even in unstructured, uncertain environments. Moreover, it is not restricted to predetermined model templates and can extract much more general types of models. Besides finding the kinematic model of objects, PVA is a powerful tool for estimating their degrees of freedom. The computational success of PVA depends on exploiting sparsity, in particular on algebraic dimension minimization through replacement of the intractable $\ell_0$ norm (rank) with the tractable $\ell_1$ norm (nuclear norm). A complete characterization of the assumptions under which the $\ell_0$ and $\ell_1$ norm minimizations yield virtually the same outcome is posed as an important open problem in this paper.
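The $\ell_0$-to-$\ell_1$ replacement mentioned in the abstract is the standard convex relaxation of rank minimization: the rank of a matrix is the $\ell_0$ norm of its singular-value vector, and the nuclear norm is the $\ell_1$ norm of that same vector. In symbols (a standard formulation, not quoted from the paper), $\operatorname{rank}(A) = \|\sigma(A)\|_0$ and $\|A\|_* = \sum_i \sigma_i(A) = \|\sigma(A)\|_1$, where $\sigma(A)$ denotes the vector of singular values of $A$; the intractable problem $\min\{\operatorname{rank}(A) : A \in \mathcal{C}\}$ is thus relaxed to the convex program $\min\{\|A\|_* : A \in \mathcal{C}\}$ over the same feasible set $\mathcal{C}$.

To make the variety-fitting idea concrete, below is a minimal Python sketch of one way to recover defining polynomials of a variety from sampled feature coordinates: evaluate all monomials up to some degree on the samples, then read off the (near-)null space of the resulting matrix, each of whose vectors gives one polynomial that approximately vanishes on the data. This is an illustrative reconstruction, not the authors' implementation; the maximum degree d, the noise level, and the singular-value threshold are all assumptions.

    # Sketch: fit a principal variety to 2D samples from a unit circle.
    # Illustrative only; degree and threshold choices are assumptions.
    import itertools
    import numpy as np

    def monomial_matrix(X, d):
        """Evaluate all monomials of total degree <= d at each row of X."""
        n, k = X.shape
        exps = [e for e in itertools.product(range(d + 1), repeat=k) if sum(e) <= d]
        return np.column_stack([np.prod(X ** np.array(e), axis=1) for e in exps]), exps

    # Noisy samples from the variety x^2 + y^2 - 1 = 0.
    rng = np.random.default_rng(0)
    t = rng.uniform(0, 2 * np.pi, 200)
    X = np.column_stack([np.cos(t), np.sin(t)]) + 0.001 * rng.standard_normal((200, 2))

    M, exps = monomial_matrix(X, d=2)
    # Each near-null right singular vector of M is one polynomial that
    # approximately vanishes on the data, i.e. one defining equation.
    _, s, Vt = np.linalg.svd(M, full_matrices=False)
    null = Vt[s < 1e-2 * s[0]]  # threshold relative to s[0] is an assumption
    print("independent polynomial relations:", len(null))
    # For a complete intersection, ambient dimension minus the number of
    # independent relations estimates the variety's dimension.
    print("estimated variety dimension:", X.shape[1] - len(null))
    if len(null):
        coeffs = {e: round(float(c), 2) for e, c in zip(exps, null[0])}
        print("a vanishing polynomial (coefficient per exponent):", coeffs)

On these samples the sketch finds one independent relation, proportional to $x^2 + y^2 - 1$, and hence estimates a one-dimensional variety (a curve), matching the circle's single degree of freedom.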

Cite this Paper


BibTeX
@InProceedings{pmlr-v78-iraji17a,
  title     = {Principal Variety Analysis},
  author    = {Iraji, Reza and Chitsaz, Hamidreza},
  booktitle = {Proceedings of the 1st Annual Conference on Robot Learning},
  pages     = {97--108},
  year      = {2017},
  editor    = {Levine, Sergey and Vanhoucke, Vincent and Goldberg, Ken},
  volume    = {78},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--15 Nov},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v78/iraji17a/iraji17a.pdf},
  url       = {https://proceedings.mlr.press/v78/iraji17a.html}
}
Endnote
%0 Conference Paper
%T Principal Variety Analysis
%A Reza Iraji
%A Hamidreza Chitsaz
%B Proceedings of the 1st Annual Conference on Robot Learning
%C Proceedings of Machine Learning Research
%D 2017
%E Sergey Levine
%E Vincent Vanhoucke
%E Ken Goldberg
%F pmlr-v78-iraji17a
%I PMLR
%P 97--108
%U https://proceedings.mlr.press/v78/iraji17a.html
%V 78
APA
Iraji, R. & Chitsaz, H. (2017). Principal Variety Analysis. Proceedings of the 1st Annual Conference on Robot Learning, in Proceedings of Machine Learning Research 78:97-108. Available from https://proceedings.mlr.press/v78/iraji17a.html.
