A case for new neural network smoothness constraints

Mihaela Rosca, Theophane Weber, Arthur Gretton, Shakir Mohamed
Proceedings on "I Can't Believe It's Not Better!" at NeurIPS Workshops, PMLR 137:21-32, 2020.

Abstract

How sensitive should machine learning models be to input changes? We tackle the question of model smoothness and show that it is a useful inductive bias which aids generalization, adversarial robustness, generative modeling and reinforcement learning. We explore current methods of imposing smoothness constraints and observe that they lack the flexibility to adapt to new tasks, do not account for data modalities, and interact with losses, architectures and optimization in ways that are not yet fully understood. We conclude that new advances in the field hinge on finding ways to incorporate data, tasks and learning into our definitions of smoothness.

Cite this Paper


BibTeX
@InProceedings{pmlr-v137-rosca20a,
  title     = {A case for new neural network smoothness constraints},
  author    = {Rosca, Mihaela and Weber, Theophane and Gretton, Arthur and Mohamed, Shakir},
  booktitle = {Proceedings on "I Can't Believe It's Not Better!" at NeurIPS Workshops},
  pages     = {21--32},
  year      = {2020},
  editor    = {Zosa Forde, Jessica and Ruiz, Francisco and Pradier, Melanie F. and Schein, Aaron},
  volume    = {137},
  series    = {Proceedings of Machine Learning Research},
  month     = {12 Dec},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v137/rosca20a/rosca20a.pdf},
  url       = {https://proceedings.mlr.press/v137/rosca20a.html},
  abstract  = {How sensitive should machine learning models be to input changes? We tackle the question of model smoothness and show that it is a useful inductive bias which aids generalization, adversarial robustness, generative modeling and reinforcement learning. We explore current methods of imposing smoothness constraints and observe they lack the flexibility to adapt to new tasks, they don’t account for data modalities, they interact with losses, architectures and optimization in ways not yet fully understood. We conclude that new advances in the field are hinging on finding ways to incorporate data, tasks and learning into our definitions of smoothness.}
}
Endnote
%0 Conference Paper
%T A case for new neural network smoothness constraints
%A Mihaela Rosca
%A Theophane Weber
%A Arthur Gretton
%A Shakir Mohamed
%B Proceedings on "I Can't Believe It's Not Better!" at NeurIPS Workshops
%C Proceedings of Machine Learning Research
%D 2020
%E Jessica Zosa Forde
%E Francisco Ruiz
%E Melanie F. Pradier
%E Aaron Schein
%F pmlr-v137-rosca20a
%I PMLR
%P 21--32
%U https://proceedings.mlr.press/v137/rosca20a.html
%V 137
%X How sensitive should machine learning models be to input changes? We tackle the question of model smoothness and show that it is a useful inductive bias which aids generalization, adversarial robustness, generative modeling and reinforcement learning. We explore current methods of imposing smoothness constraints and observe they lack the flexibility to adapt to new tasks, they don’t account for data modalities, they interact with losses, architectures and optimization in ways not yet fully understood. We conclude that new advances in the field are hinging on finding ways to incorporate data, tasks and learning into our definitions of smoothness.
APA
Rosca, M., Weber, T., Gretton, A. & Mohamed, S. (2020). A case for new neural network smoothness constraints. Proceedings on "I Can't Believe It's Not Better!" at NeurIPS Workshops, in Proceedings of Machine Learning Research 137:21-32. Available from https://proceedings.mlr.press/v137/rosca20a.html.