Robust descent using smoothed multiplicative noise

Matthew J. Holland
Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics, PMLR 89:703-711, 2019.

Abstract

In this work, we propose a novel robust gradient descent procedure which makes use of a smoothed multiplicative noise applied directly to observations before constructing a sum of soft-truncated gradient coordinates. We show that the procedure has competitive theoretical guarantees, with the major advantage of a simple implementation that does not require an iterative sub-routine for robustification. Empirical tests reinforce the theory, showing more efficient generalization over a much wider class of data distributions.
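The abstract describes perturbing observations with smoothed multiplicative noise and then averaging soft-truncated per-coordinate gradients. A minimal illustrative sketch of that idea for squared loss is below; the truncation function (arctan-based), noise distribution, and all parameter choices are assumptions for illustration, not the paper's exact construction:

```python
import numpy as np

rng = np.random.default_rng(0)

def soft_truncate(x, scale=1.0):
    # Bounded, odd "soft truncation" applied coordinate-wise.
    # The arctan form here is an illustrative choice, not necessarily
    # the exact truncation used in the paper.
    return scale * np.arctan(x / scale)

def robust_grad(X, y, w, noise_sd=0.1, scale=1.0):
    # Hypothetical robust gradient for squared loss:
    # 1) perturb observations with multiplicative noise (mean 1),
    # 2) soft-truncate the per-point gradient coordinates,
    # 3) average over the sample.
    noise = 1.0 + noise_sd * rng.standard_normal(X.shape)
    Xn = X * noise                      # multiplicative noise on observations
    resid = Xn @ w - y
    grads = resid[:, None] * Xn         # per-point, per-coordinate gradients
    return soft_truncate(grads, scale).mean(axis=0)

def robust_gd(X, y, steps=200, lr=0.1):
    # Plain gradient descent driven by the robust gradient estimate;
    # note there is no iterative robustification sub-routine per step.
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        w -= lr * robust_grad(X, y, w)
    return w
```

As the abstract emphasizes, the appeal of this style of estimator is that each update is a single pass over the data, unlike robust mean estimators that require an inner fixed-point iteration.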

Cite this Paper


BibTeX
@InProceedings{pmlr-v89-holland19a,
  title     = {Robust descent using smoothed multiplicative noise},
  author    = {Holland, Matthew J.},
  booktitle = {Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics},
  pages     = {703--711},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Sugiyama, Masashi},
  volume    = {89},
  series    = {Proceedings of Machine Learning Research},
  month     = {16--18 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v89/holland19a/holland19a.pdf},
  url       = {https://proceedings.mlr.press/v89/holland19a.html},
  abstract  = {In this work, we propose a novel robust gradient descent procedure which makes use of a smoothed multiplicative noise applied directly to observations before constructing a sum of soft-truncated gradient coordinates. We show that the procedure has competitive theoretical guarantees, with the major advantage of a simple implementation that does not require an iterative sub-routine for robustification. Empirical tests reinforce the theory, showing more efficient generalization over a much wider class of data distributions.}
}
Endnote
%0 Conference Paper
%T Robust descent using smoothed multiplicative noise
%A Matthew J. Holland
%B Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Masashi Sugiyama
%F pmlr-v89-holland19a
%I PMLR
%P 703--711
%U https://proceedings.mlr.press/v89/holland19a.html
%V 89
%X In this work, we propose a novel robust gradient descent procedure which makes use of a smoothed multiplicative noise applied directly to observations before constructing a sum of soft-truncated gradient coordinates. We show that the procedure has competitive theoretical guarantees, with the major advantage of a simple implementation that does not require an iterative sub-routine for robustification. Empirical tests reinforce the theory, showing more efficient generalization over a much wider class of data distributions.
APA
Holland, M.J. (2019). Robust descent using smoothed multiplicative noise. Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 89:703-711. Available from https://proceedings.mlr.press/v89/holland19a.html.