Active Learning for Single Neuron Models with Lipschitz Non-Linearities

Aarshvi Gajjar, Christopher Musco, Chinmay Hegde
Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, PMLR 206:4101-4113, 2023.

Abstract

We consider the problem of active learning for single neuron models, also sometimes called “ridge functions”, in the agnostic setting (under adversarial label noise). Such models have been shown to be broadly effective in modeling physical phenomena, and for constructing surrogate data-driven models for partial differential equations. Surprisingly, we show that for a single neuron model with any Lipschitz non-linearity (such as the ReLU, sigmoid, absolute value, low-degree polynomial, among others), strong provable approximation guarantees can be obtained using a well-known active learning strategy for fitting linear functions in the agnostic setting. Namely, we can collect samples via statistical leverage score sampling, which has been shown to be near-optimal in other active learning scenarios. We support our theoretical results with empirical simulations showing that our proposed active learning strategy based on leverage score sampling outperforms (ordinary) uniform sampling when fitting single neuron models.
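The paper is not accompanied here by code, but the sampling primitive it builds on is classical. As a minimal illustrative sketch (not the authors' implementation), the following computes statistical leverage scores of a design matrix via a thin SVD and draws a reweighted subsample, as is standard for active linear regression; the helper names are our own.

```python
import numpy as np

def leverage_scores(A):
    # Leverage score of row i is a_i^T (A^T A)^+ a_i, which equals
    # the squared norm of row i of U from a thin SVD A = U S V^T.
    U, _, _ = np.linalg.svd(A, full_matrices=False)
    return np.sum(U ** 2, axis=1)

def leverage_sample(A, m, seed=None):
    # Draw m row indices with probability proportional to leverage score,
    # returning rescaling weights so that the subsampled least-squares
    # objective is an unbiased estimate of the full objective.
    rng = np.random.default_rng(seed)
    scores = leverage_scores(A)
    p = scores / scores.sum()
    idx = rng.choice(A.shape[0], size=m, replace=True, p=p)
    weights = 1.0 / np.sqrt(m * p[idx])
    return idx, weights
```

In an active learning loop one would query labels only at the sampled indices `idx` and fit the (weighted) model on those rows; uniform sampling corresponds to replacing `p` with a constant vector.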

Cite this Paper


BibTeX
@InProceedings{pmlr-v206-gajjar23a,
  title     = {Active Learning for Single Neuron Models with Lipschitz Non-Linearities},
  author    = {Gajjar, Aarshvi and Musco, Christopher and Hegde, Chinmay},
  booktitle = {Proceedings of The 26th International Conference on Artificial Intelligence and Statistics},
  pages     = {4101--4113},
  year      = {2023},
  editor    = {Ruiz, Francisco and Dy, Jennifer and van de Meent, Jan-Willem},
  volume    = {206},
  series    = {Proceedings of Machine Learning Research},
  month     = {25--27 Apr},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v206/gajjar23a/gajjar23a.pdf},
  url       = {https://proceedings.mlr.press/v206/gajjar23a.html},
  abstract  = {We consider the problem of active learning for single neuron models, also sometimes called “ridge functions”, in the agnostic setting (under adversarial label noise). Such models have been shown to be broadly effective in modeling physical phenomena, and for constructing surrogate data-driven models for partial differential equations. Surprisingly, we show that for a single neuron model with any Lipschitz non-linearity (such as the ReLU, sigmoid, absolute value, low-degree polynomial, among others), strong provable approximation guarantees can be obtained using a well-known active learning strategy for fitting linear functions in the agnostic setting. Namely, we can collect samples via statistical leverage score sampling, which has been shown to be near-optimal in other active learning scenarios. We support our theoretical results with empirical simulations showing that our proposed active learning strategy based on leverage score sampling outperforms (ordinary) uniform sampling when fitting single neuron models.}
}
Endnote
%0 Conference Paper
%T Active Learning for Single Neuron Models with Lipschitz Non-Linearities
%A Aarshvi Gajjar
%A Christopher Musco
%A Chinmay Hegde
%B Proceedings of The 26th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2023
%E Francisco Ruiz
%E Jennifer Dy
%E Jan-Willem van de Meent
%F pmlr-v206-gajjar23a
%I PMLR
%P 4101--4113
%U https://proceedings.mlr.press/v206/gajjar23a.html
%V 206
%X We consider the problem of active learning for single neuron models, also sometimes called “ridge functions”, in the agnostic setting (under adversarial label noise). Such models have been shown to be broadly effective in modeling physical phenomena, and for constructing surrogate data-driven models for partial differential equations. Surprisingly, we show that for a single neuron model with any Lipschitz non-linearity (such as the ReLU, sigmoid, absolute value, low-degree polynomial, among others), strong provable approximation guarantees can be obtained using a well-known active learning strategy for fitting linear functions in the agnostic setting. Namely, we can collect samples via statistical leverage score sampling, which has been shown to be near-optimal in other active learning scenarios. We support our theoretical results with empirical simulations showing that our proposed active learning strategy based on leverage score sampling outperforms (ordinary) uniform sampling when fitting single neuron models.
APA
Gajjar, A., Musco, C. & Hegde, C. (2023). Active Learning for Single Neuron Models with Lipschitz Non-Linearities. Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 206:4101-4113. Available from https://proceedings.mlr.press/v206/gajjar23a.html.