Online Infinite-Dimensional Regression: Learning Linear Operators

Unique Subedi, Vinod Raman, Ambuj Tewari
Proceedings of The 35th International Conference on Algorithmic Learning Theory, PMLR 237:1113-1133, 2024.

Abstract

We consider the problem of learning linear operators under squared loss between two infinite-dimensional Hilbert spaces in the online setting. We show that the class of linear operators with uniformly bounded $p$-Schatten norm is online learnable for any $p \in [1, \infty)$. On the other hand, we prove an impossibility result by showing that the class of uniformly bounded linear operators with respect to the operator norm is not online learnable. Moreover, we show a separation between sequential uniform convergence and online learnability by identifying a class of bounded linear operators that is online learnable but for which uniform convergence does not hold. Finally, we prove that the impossibility result and the separation between uniform convergence and learnability also hold in the batch setting.
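For context, the quantities appearing in the abstract can be recalled as follows; this is a standard formulation, and the paper's exact setup may differ in details. For a compact linear operator $T$ between Hilbert spaces with singular values $s_1(T) \ge s_2(T) \ge \dots$, the $p$-Schatten norm is
$$\|T\|_{S_p} = \Big( \sum_{k \ge 1} s_k(T)^p \Big)^{1/p}, \qquad p \in [1, \infty),$$
while the operator norm corresponds to the limit $p = \infty$, i.e. $\|T\|_{\mathrm{op}} = s_1(T)$. In online regression under squared loss, a learner observes a sequence $(x_t, y_t)_{t=1}^{T}$, predicting $\hat{y}_t$ before $y_t$ is revealed, and a class $\mathcal{F}$ of operators is online learnable if some learner guarantees sublinear regret:
$$\sum_{t=1}^{T} \|\hat{y}_t - y_t\|^2 \;-\; \inf_{A \in \mathcal{F}} \sum_{t=1}^{T} \|A x_t - y_t\|^2 \;=\; o(T).$$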

Cite this Paper


BibTeX
@InProceedings{pmlr-v237-subedi24a,
  title     = {Online Infinite-Dimensional Regression: Learning Linear Operators},
  author    = {Subedi, Unique and Raman, Vinod and Tewari, Ambuj},
  booktitle = {Proceedings of The 35th International Conference on Algorithmic Learning Theory},
  pages     = {1113--1133},
  year      = {2024},
  editor    = {Vernade, Claire and Hsu, Daniel},
  volume    = {237},
  series    = {Proceedings of Machine Learning Research},
  month     = {25--28 Feb},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v237/subedi24a/subedi24a.pdf},
  url       = {https://proceedings.mlr.press/v237/subedi24a.html},
  abstract  = {We consider the problem of learning linear operators under squared loss between two infinite-dimensional Hilbert spaces in the online setting. We show that the class of linear operators with uniformly bounded $p$-Schatten norm is online learnable for any $p \in [1, \infty)$. On the other hand, we prove an impossibility result by showing that the class of uniformly bounded linear operators with respect to the operator norm is \textit{not} online learnable. Moreover, we show a separation between sequential uniform convergence and online learnability by identifying a class of bounded linear operators that is online learnable but uniform convergence does not hold. Finally, we prove that the impossibility result and the separation between uniform convergence and learnability also hold in the batch setting.}
}
Endnote
%0 Conference Paper
%T Online Infinite-Dimensional Regression: Learning Linear Operators
%A Unique Subedi
%A Vinod Raman
%A Ambuj Tewari
%B Proceedings of The 35th International Conference on Algorithmic Learning Theory
%C Proceedings of Machine Learning Research
%D 2024
%E Claire Vernade
%E Daniel Hsu
%F pmlr-v237-subedi24a
%I PMLR
%P 1113--1133
%U https://proceedings.mlr.press/v237/subedi24a.html
%V 237
%X We consider the problem of learning linear operators under squared loss between two infinite-dimensional Hilbert spaces in the online setting. We show that the class of linear operators with uniformly bounded $p$-Schatten norm is online learnable for any $p \in [1, \infty)$. On the other hand, we prove an impossibility result by showing that the class of uniformly bounded linear operators with respect to the operator norm is \textit{not} online learnable. Moreover, we show a separation between sequential uniform convergence and online learnability by identifying a class of bounded linear operators that is online learnable but uniform convergence does not hold. Finally, we prove that the impossibility result and the separation between uniform convergence and learnability also hold in the batch setting.
APA
Subedi, U., Raman, V. & Tewari, A. (2024). Online Infinite-Dimensional Regression: Learning Linear Operators. Proceedings of The 35th International Conference on Algorithmic Learning Theory, in Proceedings of Machine Learning Research 237:1113-1133. Available from https://proceedings.mlr.press/v237/subedi24a.html.