Perceptronic Complexity and Online Matrix Completion

Stephen Pasteris
Proceedings of The 34th International Conference on Algorithmic Learning Theory, PMLR 201:1261-1291, 2023.

Abstract

We present an online algorithm for learning a binary relation, or equivalently the components of a binary-valued matrix. We derive mistake bounds for this algorithm that are based on a novel complexity measure called the perceptronic complexity. Informally, if we consider each row of the matrix as a separate learning task, then the perceptronic complexity is the Novikoff perceptron bound for learning the whole matrix when given the optimal kernel over the columns of the matrix. Our mistake bound is equal, up to a logarithmic factor, to the perceptronic complexity of the matrix plus the perceptronic complexity of its transpose. We show how this mistake bound asymptotically outperforms those of previous algorithms for the same problem, and give examples of our bound on natural families of matrices.
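To make the quantity concrete: the abstract's "perceptronic complexity" is, informally, the Novikoff mistake bound for a kernel perceptron run row-by-row when the best possible kernel over the columns is supplied. The sketch below only illustrates the underlying mistake-counting process on a toy ±1 matrix; it is not the paper's algorithm, and the kernel used (plain inner products of the matrix's own columns) is an arbitrary illustrative choice, not the optimal kernel the paper analyzes.

```python
import numpy as np

# Hedged sketch: run a dual (kernel) perceptron separately on each row of a
# small {-1, +1} matrix, with columns arriving online, and count the total
# number of mistakes. The Novikoff bound controls this count for a given
# kernel; the paper's perceptronic complexity takes the OPTIMAL kernel over
# the columns. The kernel K here is just an illustrative placeholder.

rng = np.random.default_rng(0)
M = np.sign(rng.standard_normal((4, 6)))   # toy 4x6 matrix with +/-1 entries
K = M.T @ M                                 # illustrative kernel over the 6 columns

mistakes = 0
for i in range(M.shape[0]):                 # each row is a separate learning task
    alpha = np.zeros(M.shape[1])            # dual weights over the columns
    for t in range(M.shape[1]):             # columns revealed one at a time
        y_hat = np.sign(alpha @ K[:, t]) or 1.0   # predict; break ties as +1
        if y_hat != M[i, t]:                # mistake: standard perceptron update
            mistakes += 1
            alpha[t] += M[i, t]

print("total mistakes over all rows:", mistakes)
```

Summing the per-row mistake counts in this way is what motivates stating the paper's bound in terms of the matrix plus its transpose: the same process can equally be run column-by-column.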

Cite this Paper


BibTeX
@InProceedings{pmlr-v201-pasteris23a,
  title     = {Perceptronic Complexity and Online Matrix Completion},
  author    = {Pasteris, Stephen},
  booktitle = {Proceedings of The 34th International Conference on Algorithmic Learning Theory},
  pages     = {1261--1291},
  year      = {2023},
  editor    = {Agrawal, Shipra and Orabona, Francesco},
  volume    = {201},
  series    = {Proceedings of Machine Learning Research},
  month     = {20 Feb--23 Feb},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v201/pasteris23a/pasteris23a.pdf},
  url       = {https://proceedings.mlr.press/v201/pasteris23a.html},
  abstract  = {We present an online algorithm for learning a binary relation, or equivalently the components of a binary-valued matrix. We derive mistake bounds for this algorithm that are based on a novel complexity measure called the perceptronic complexity. Informally, if we consider each row of the matrix as a separate learning task, then the perceptronic complexity is the Novikoff perceptron bound for learning the whole matrix when given the optimal kernel over the columns of the matrix. Our mistake bound is equal, up to a logarithmic factor, to the perceptronic complexity of the matrix plus the perceptronic complexity of its transpose. We show how this mistake bound asymptotically outperforms those of previous algorithms for the same problem, and give examples of our bound on natural families of matrices.}
}
Endnote
%0 Conference Paper
%T Perceptronic Complexity and Online Matrix Completion
%A Stephen Pasteris
%B Proceedings of The 34th International Conference on Algorithmic Learning Theory
%C Proceedings of Machine Learning Research
%D 2023
%E Shipra Agrawal
%E Francesco Orabona
%F pmlr-v201-pasteris23a
%I PMLR
%P 1261--1291
%U https://proceedings.mlr.press/v201/pasteris23a.html
%V 201
%X We present an online algorithm for learning a binary relation, or equivalently the components of a binary-valued matrix. We derive mistake bounds for this algorithm that are based on a novel complexity measure called the perceptronic complexity. Informally, if we consider each row of the matrix as a separate learning task, then the perceptronic complexity is the Novikoff perceptron bound for learning the whole matrix when given the optimal kernel over the columns of the matrix. Our mistake bound is equal, up to a logarithmic factor, to the perceptronic complexity of the matrix plus the perceptronic complexity of its transpose. We show how this mistake bound asymptotically outperforms those of previous algorithms for the same problem, and give examples of our bound on natural families of matrices.
APA
Pasteris, S. (2023). Perceptronic Complexity and Online Matrix Completion. Proceedings of The 34th International Conference on Algorithmic Learning Theory, in Proceedings of Machine Learning Research 201:1261-1291. Available from https://proceedings.mlr.press/v201/pasteris23a.html.