Nonparametric Teaching of Implicit Neural Representations

Chen Zhang, Steven Tin Sui Luo, Jason Chun Lok Li, Yik Chung Wu, Ngai Wong
Proceedings of the 41st International Conference on Machine Learning, PMLR 235:59435-59458, 2024.

Abstract

We investigate the learning of implicit neural representation (INR) using an overparameterized multilayer perceptron (MLP) via a novel nonparametric teaching perspective. The latter offers an efficient example selection framework for teaching nonparametrically defined (viz. non-closed-form) target functions, such as image functions defined by 2D grids of pixels. To address the costly training of INRs, we propose a paradigm called Implicit Neural Teaching (INT) that treats INR learning as a nonparametric teaching problem, where the given signal being fitted serves as the target function. The teacher then selects signal fragments for iterative training of the MLP to achieve fast convergence. By establishing a connection between the evolution of an MLP under parameter-based gradient descent and the evolution of a function under functional gradient descent in nonparametric teaching, we show for the first time that teaching an overparameterized MLP is consistent with teaching a nonparametric learner. This new discovery readily permits a convenient drop-in of nonparametric teaching algorithms to broadly enhance INR training efficiency, demonstrating 30%+ training time savings across various input modalities.
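
To make the selection loop described in the abstract concrete, the following minimal sketch fits an image INR with a greedy, error-based fragment selection rule: at each step the "teacher" scores every pixel by its current reconstruction error and trains the MLP only on the highest-error fragments. The SIREN-like MLP, the squared-error disparity measure, and all names and hyperparameters (make_mlp, fit_inr_with_int, the top-k size, the learning rate) are illustrative assumptions, not the authors' released implementation.

# Minimal sketch of INT-style greedy fragment selection for fitting an image INR.
# Assumptions (not from the paper's code): a small SIREN-like MLP, per-pixel
# squared error as the disparity measure, and top-k selection at every step.
import torch
import torch.nn as nn

class Sine(nn.Module):
    """Sinusoidal activation, as used in SIREN-style INRs."""
    def __init__(self, w0=30.0):
        super().__init__()
        self.w0 = w0
    def forward(self, x):
        return torch.sin(self.w0 * x)

def make_mlp(in_dim=2, hidden=256, out_dim=3, depth=4):
    layers, d = [], in_dim
    for _ in range(depth):
        layers += [nn.Linear(d, hidden), Sine()]
        d = hidden
    layers += [nn.Linear(d, out_dim)]
    return nn.Sequential(*layers)

def fit_inr_with_int(coords, pixels, steps=2000, k=4096, lr=1e-4):
    """coords: (N, 2) grid coordinates in [-1, 1]; pixels: (N, 3) target colors.
    k must not exceed N (the number of available signal fragments)."""
    mlp = make_mlp()
    opt = torch.optim.Adam(mlp.parameters(), lr=lr)
    for _ in range(steps):
        # Teacher step: score every fragment by its current disparity.
        with torch.no_grad():
            err = (mlp(coords) - pixels).pow(2).sum(dim=-1)
        idx = torch.topk(err, k).indices
        # Learner step: parameter-based gradient descent on the selected fragments only.
        loss = (mlp(coords[idx]) - pixels[idx]).pow(2).mean()
        opt.zero_grad()
        loss.backward()
        opt.step()
    return mlp

In this sketch the full forward pass is used only to rank fragments; the gradient update touches just the selected subset, which is where the training-time savings reported in the paper would come from under this kind of selection rule.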

Cite this Paper


BibTeX
@InProceedings{pmlr-v235-zhang24ap,
  title     = {Nonparametric Teaching of Implicit Neural Representations},
  author    = {Zhang, Chen and Luo, Steven Tin Sui and Li, Jason Chun Lok and Wu, Yik Chung and Wong, Ngai},
  booktitle = {Proceedings of the 41st International Conference on Machine Learning},
  pages     = {59435--59458},
  year      = {2024},
  editor    = {Salakhutdinov, Ruslan and Kolter, Zico and Heller, Katherine and Weller, Adrian and Oliver, Nuria and Scarlett, Jonathan and Berkenkamp, Felix},
  volume    = {235},
  series    = {Proceedings of Machine Learning Research},
  month     = {21--27 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v235/main/assets/zhang24ap/zhang24ap.pdf},
  url       = {https://proceedings.mlr.press/v235/zhang24ap.html},
  abstract  = {We investigate the learning of implicit neural representation (INR) using an overparameterized multilayer perceptron (MLP) via a novel nonparametric teaching perspective. The latter offers an efficient example selection framework for teaching nonparametrically defined (viz. non-closed-form) target functions, such as image functions defined by 2D grids of pixels. To address the costly training of INRs, we propose a paradigm called Implicit Neural Teaching (INT) that treats INR learning as a nonparametric teaching problem, where the given signal being fitted serves as the target function. The teacher then selects signal fragments for iterative training of the MLP to achieve fast convergence. By establishing a connection between MLP evolution through parameter-based gradient descent and that of function evolution through functional gradient descent in nonparametric teaching, we show for the first time that teaching an overparameterized MLP is consistent with teaching a nonparametric learner. This new discovery readily permits a convenient drop-in of nonparametric teaching algorithms to broadly enhance INR training efficiency, demonstrating 30%+ training time savings across various input modalities.}
}
Endnote
%0 Conference Paper
%T Nonparametric Teaching of Implicit Neural Representations
%A Chen Zhang
%A Steven Tin Sui Luo
%A Jason Chun Lok Li
%A Yik Chung Wu
%A Ngai Wong
%B Proceedings of the 41st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2024
%E Ruslan Salakhutdinov
%E Zico Kolter
%E Katherine Heller
%E Adrian Weller
%E Nuria Oliver
%E Jonathan Scarlett
%E Felix Berkenkamp
%F pmlr-v235-zhang24ap
%I PMLR
%P 59435--59458
%U https://proceedings.mlr.press/v235/zhang24ap.html
%V 235
%X We investigate the learning of implicit neural representation (INR) using an overparameterized multilayer perceptron (MLP) via a novel nonparametric teaching perspective. The latter offers an efficient example selection framework for teaching nonparametrically defined (viz. non-closed-form) target functions, such as image functions defined by 2D grids of pixels. To address the costly training of INRs, we propose a paradigm called Implicit Neural Teaching (INT) that treats INR learning as a nonparametric teaching problem, where the given signal being fitted serves as the target function. The teacher then selects signal fragments for iterative training of the MLP to achieve fast convergence. By establishing a connection between MLP evolution through parameter-based gradient descent and that of function evolution through functional gradient descent in nonparametric teaching, we show for the first time that teaching an overparameterized MLP is consistent with teaching a nonparametric learner. This new discovery readily permits a convenient drop-in of nonparametric teaching algorithms to broadly enhance INR training efficiency, demonstrating 30%+ training time savings across various input modalities.
APA
Zhang, C., Luo, S.T.S., Li, J.C.L., Wu, Y.C. & Wong, N. (2024). Nonparametric Teaching of Implicit Neural Representations. Proceedings of the 41st International Conference on Machine Learning, in Proceedings of Machine Learning Research 235:59435-59458. Available from https://proceedings.mlr.press/v235/zhang24ap.html.