C*-algebra Net: A New Approach Generalizing Neural Network Parameters to C*-algebra

Yuka Hashimoto, Zhao Wang, Tomoko Matsui
Proceedings of the 39th International Conference on Machine Learning, PMLR 162:8523-8534, 2022.

Abstract

We propose a new framework that generalizes the parameters of neural network models to $C^*$-algebra-valued ones. A $C^*$-algebra is a generalization of the space of complex numbers; a typical example is the space of continuous functions on a compact space. This generalization enables us to combine multiple models continuously and to use tools for functions such as regression and integration. Consequently, we can learn features of data efficiently and adapt the models to problems continuously. We apply our framework to practical problems such as density estimation and few-shot learning and show that it enables us to learn features of data even with a limited number of samples. Our new framework highlights the potential of applying the theory of $C^*$-algebras to general neural network models.
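To make the construction concrete, below is a minimal, hypothetical sketch (not the authors' implementation) of a dense layer whose weights take values in $C([0, 1])$, the $C^*$-algebra of continuous functions on a compact interval. The class name, the polynomial parameterization, and all sizes are illustrative assumptions; the point is that evaluating the layer at a fixed $z \in [0, 1]$ recovers an ordinary real-valued layer, and integrating over $z$ combines a continuum of models.

import numpy as np

class CStarDenseLayer:
    """Hypothetical dense layer with weights in C([0, 1])."""

    def __init__(self, in_dim, out_dim, degree=3, rng=None):
        rng = rng if rng is not None else np.random.default_rng(0)
        # Each function-valued weight is a polynomial on [0, 1]:
        # w_ij(z) = sum_k coef[i, j, k] * z**k, a smooth element of C([0, 1]).
        self.coef = rng.normal(scale=0.1, size=(out_dim, in_dim, degree + 1))
        self.bias = rng.normal(scale=0.1, size=(out_dim, degree + 1))

    def weights_at(self, z):
        # Evaluate every function-valued parameter at the point z.
        powers = z ** np.arange(self.coef.shape[-1])    # (degree + 1,)
        return self.coef @ powers, self.bias @ powers   # (out, in), (out,)

    def forward(self, x, z):
        # At a fixed z, this is an ordinary dense layer with scalar weights.
        w, b = self.weights_at(z)
        return np.tanh(w @ x + b)

layer = CStarDenseLayer(in_dim=4, out_dim=2)
x = np.ones(4)
# Two "nearby" models drawn from the same function-valued parameters:
y_a, y_b = layer.forward(x, z=0.25), layer.forward(x, z=0.75)
# Numerically integrating over z blends the continuum of models into one output:
zs = np.linspace(0.0, 1.0, 101)
y_blend = np.mean([layer.forward(x, z) for z in zs], axis=0)

Under this reading, "combining multiple models continuously" means the index $z$ interpolates smoothly between models sharing one set of function-valued parameters, and tools for functions such as regression over $z$ or integration against a measure act directly on those parameters.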

Cite this Paper

BibTeX
@InProceedings{pmlr-v162-hashimoto22a,
  title     = {C*-algebra Net: A New Approach Generalizing Neural Network Parameters to C*-algebra},
  author    = {Hashimoto, Yuka and Wang, Zhao and Matsui, Tomoko},
  booktitle = {Proceedings of the 39th International Conference on Machine Learning},
  pages     = {8523--8534},
  year      = {2022},
  editor    = {Chaudhuri, Kamalika and Jegelka, Stefanie and Song, Le and Szepesvari, Csaba and Niu, Gang and Sabato, Sivan},
  volume    = {162},
  series    = {Proceedings of Machine Learning Research},
  month     = {17--23 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v162/hashimoto22a/hashimoto22a.pdf},
  url       = {https://proceedings.mlr.press/v162/hashimoto22a.html},
  abstract  = {We propose a new framework that generalizes the parameters of neural network models to $C^*$-algebra-valued ones. A $C^*$-algebra is a generalization of the space of complex numbers; a typical example is the space of continuous functions on a compact space. This generalization enables us to combine multiple models continuously and to use tools for functions such as regression and integration. Consequently, we can learn features of data efficiently and adapt the models to problems continuously. We apply our framework to practical problems such as density estimation and few-shot learning and show that it enables us to learn features of data even with a limited number of samples. Our new framework highlights the potential of applying the theory of $C^*$-algebras to general neural network models.}
}
Endnote
%0 Conference Paper
%T C*-algebra Net: A New Approach Generalizing Neural Network Parameters to C*-algebra
%A Yuka Hashimoto
%A Zhao Wang
%A Tomoko Matsui
%B Proceedings of the 39th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2022
%E Kamalika Chaudhuri
%E Stefanie Jegelka
%E Le Song
%E Csaba Szepesvari
%E Gang Niu
%E Sivan Sabato
%F pmlr-v162-hashimoto22a
%I PMLR
%P 8523--8534
%U https://proceedings.mlr.press/v162/hashimoto22a.html
%V 162
%X We propose a new framework that generalizes the parameters of neural network models to $C^*$-algebra-valued ones. A $C^*$-algebra is a generalization of the space of complex numbers; a typical example is the space of continuous functions on a compact space. This generalization enables us to combine multiple models continuously and to use tools for functions such as regression and integration. Consequently, we can learn features of data efficiently and adapt the models to problems continuously. We apply our framework to practical problems such as density estimation and few-shot learning and show that it enables us to learn features of data even with a limited number of samples. Our new framework highlights the potential of applying the theory of $C^*$-algebras to general neural network models.
APA
Hashimoto, Y., Wang, Z. & Matsui, T. (2022). C*-algebra Net: A New Approach Generalizing Neural Network Parameters to C*-algebra. Proceedings of the 39th International Conference on Machine Learning, in Proceedings of Machine Learning Research 162:8523-8534. Available from https://proceedings.mlr.press/v162/hashimoto22a.html.
