Deep Learning for Functional Data Analysis with Adaptive Basis Layers

Junwen Yao, Jonas Mueller, Jane-Ling Wang
Proceedings of the 38th International Conference on Machine Learning, PMLR 139:11898-11908, 2021.

Abstract

Despite their widespread success, the application of deep neural networks to functional data remains scarce today. The infinite dimensionality of functional data means standard learning algorithms can be applied only after appropriate dimension reduction, typically achieved via basis expansions. Currently, these bases are chosen a priori, without reference to the task at hand, and thus may not be effective for the designated task. We instead propose to adaptively learn these bases in an end-to-end fashion. We introduce neural networks that employ a new Basis Layer whose hidden units are each basis functions themselves, implemented as micro neural networks. Our architecture learns to apply parsimonious dimension reduction to functional inputs, focusing only on information relevant to the target rather than on irrelevant variation in the input function. Across numerous classification/regression tasks with functional data, our method empirically outperforms other types of neural networks, and we prove that our approach is statistically consistent with low generalization error.
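A minimal NumPy sketch of the core idea, as a reader might prototype it: each hidden unit of the Basis Layer is a tiny MLP representing one learnable basis function b_k(t), and a discretized functional input x(t) is reduced to K scores by numerically integrating the inner products ⟨x, b_k⟩. All names, network sizes, and the fixed random weights below are illustrative assumptions, not the paper's implementation (which trains these parameters end-to-end).

```python
import numpy as np

def micro_net(params, t):
    """Tiny one-hidden-layer MLP b_k: R -> R, one learnable basis function.

    params = (W1, b1, w2, b2); tanh hidden layer. Illustrative only."""
    W1, b1, w2, b2 = params
    h = np.tanh(np.outer(t, W1) + b1)  # shape (T, H)
    return h @ w2 + b2                 # shape (T,)

def basis_layer(x_vals, t_grid, basis_params):
    """Project a discretized function x(t) onto K learned basis functions.

    Approximates the inner products <x, b_k> by a Riemann sum on t_grid,
    yielding a K-dimensional representation for downstream layers."""
    dt = t_grid[1] - t_grid[0]
    return np.array([np.sum(x_vals * micro_net(p, t_grid)) * dt
                     for p in basis_params])

# Demo with K = 4 basis functions (random, untrained weights).
rng = np.random.default_rng(0)
H, K = 8, 4
basis_params = [(rng.standard_normal(H), rng.standard_normal(H),
                 rng.standard_normal(H), rng.standard_normal())
                for _ in range(K)]
t = np.linspace(0.0, 1.0, 100)
x = np.sin(2 * np.pi * t)              # example functional input
scores = basis_layer(x, t, basis_params)
print(scores.shape)                    # (4,): one score per basis function
```

In the paper's setting these micro-network parameters are trained jointly with the downstream predictor, so the learned bases adapt to the prediction target rather than being fixed a priori (as with splines or Fourier bases).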

Cite this Paper


BibTeX
@InProceedings{pmlr-v139-yao21c,
  title     = {Deep Learning for Functional Data Analysis with Adaptive Basis Layers},
  author    = {Yao, Junwen and Mueller, Jonas and Wang, Jane-Ling},
  booktitle = {Proceedings of the 38th International Conference on Machine Learning},
  pages     = {11898--11908},
  year      = {2021},
  editor    = {Meila, Marina and Zhang, Tong},
  volume    = {139},
  series    = {Proceedings of Machine Learning Research},
  month     = {18--24 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v139/yao21c/yao21c.pdf},
  url       = {https://proceedings.mlr.press/v139/yao21c.html},
  abstract  = {Despite their widespread success, the application of deep neural networks to functional data remains scarce today. The infinite dimensionality of functional data means standard learning algorithms can be applied only after appropriate dimension reduction, typically achieved via basis expansions. Currently, these bases are chosen a priori without the information for the task at hand and thus may not be effective for the designated task. We instead propose to adaptively learn these bases in an end-to-end fashion. We introduce neural networks that employ a new Basis Layer whose hidden units are each basis functions themselves implemented as a micro neural network. Our architecture learns to apply parsimonious dimension reduction to functional inputs that focuses only on information relevant to the target rather than irrelevant variation in the input function. Across numerous classification/regression tasks with functional data, our method empirically outperforms other types of neural networks, and we prove that our approach is statistically consistent with low generalization error.}
}
Endnote
%0 Conference Paper
%T Deep Learning for Functional Data Analysis with Adaptive Basis Layers
%A Junwen Yao
%A Jonas Mueller
%A Jane-Ling Wang
%B Proceedings of the 38th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2021
%E Marina Meila
%E Tong Zhang
%F pmlr-v139-yao21c
%I PMLR
%P 11898--11908
%U https://proceedings.mlr.press/v139/yao21c.html
%V 139
%X Despite their widespread success, the application of deep neural networks to functional data remains scarce today. The infinite dimensionality of functional data means standard learning algorithms can be applied only after appropriate dimension reduction, typically achieved via basis expansions. Currently, these bases are chosen a priori without the information for the task at hand and thus may not be effective for the designated task. We instead propose to adaptively learn these bases in an end-to-end fashion. We introduce neural networks that employ a new Basis Layer whose hidden units are each basis functions themselves implemented as a micro neural network. Our architecture learns to apply parsimonious dimension reduction to functional inputs that focuses only on information relevant to the target rather than irrelevant variation in the input function. Across numerous classification/regression tasks with functional data, our method empirically outperforms other types of neural networks, and we prove that our approach is statistically consistent with low generalization error.
APA
Yao, J., Mueller, J., & Wang, J.-L. (2021). Deep Learning for Functional Data Analysis with Adaptive Basis Layers. Proceedings of the 38th International Conference on Machine Learning, in Proceedings of Machine Learning Research 139:11898-11908. Available from https://proceedings.mlr.press/v139/yao21c.html.