Generalization Properties and Implicit Regularization for Multiple Passes SGM

Junhong Lin, Raffaello Camoriano, Lorenzo Rosasco
Proceedings of The 33rd International Conference on Machine Learning, PMLR 48:2340-2348, 2016.

Abstract

We study the generalization properties of stochastic gradient methods for learning with convex loss functions and linearly parameterized functions. We show that, in the absence of penalizations or constraints, the stability and approximation properties of the algorithm can be controlled by tuning either the step-size or the number of passes over the data. In this view, these parameters can be seen to control a form of implicit regularization. Numerical results complement the theoretical findings.
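The abstract's central idea — that the number of passes over the data, with no explicit penalty or constraint, acts as a regularizer — can be illustrated with a minimal sketch (not the authors' code; all function names and toy data below are illustrative assumptions): plain SGD on least squares, where more passes drive the training error down and stopping early plays the role of explicit regularization.

```python
# Minimal illustrative sketch of multiple-passes SGD for least-squares
# regression, with no penalization or constraint. The number of passes
# (epochs) and the step-size are the only tuning parameters.
import numpy as np

def multipass_sgd(X, y, step_size=0.01, n_passes=5, seed=0):
    """Run plain SGD for n_passes shuffled epochs over (X, y)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_passes):
        for i in rng.permutation(n):          # one pass = one shuffled epoch
            grad = (X[i] @ w - y[i]) * X[i]   # gradient of the squared loss at point i
            w -= step_size * grad
    return w

# Toy data: noisy linear model (hypothetical, for illustration only)
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 10))
w_true = rng.normal(size=10)
y = X @ w_true + 0.5 * rng.normal(size=200)

w_few = multipass_sgd(X, y, n_passes=1)
w_many = multipass_sgd(X, y, n_passes=50)
err_few = np.mean((X @ w_few - y) ** 2)
err_many = np.mean((X @ w_many - y) ** 2)
# Training error decreases with more passes; stopping after fewer passes
# (or shrinking the step-size) acts as implicit regularization.
```

In this sketch the stability/approximation trade-off the paper analyzes shows up directly: with few passes the iterate stays close to its initialization (stable but underfit), while many passes fit the training data increasingly closely.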

Cite this Paper


BibTeX
@InProceedings{pmlr-v48-lina16,
  title     = {Generalization Properties and Implicit Regularization for Multiple Passes SGM},
  author    = {Lin, Junhong and Camoriano, Raffaello and Rosasco, Lorenzo},
  booktitle = {Proceedings of The 33rd International Conference on Machine Learning},
  pages     = {2340--2348},
  year      = {2016},
  editor    = {Balcan, Maria Florina and Weinberger, Kilian Q.},
  volume    = {48},
  series    = {Proceedings of Machine Learning Research},
  address   = {New York, New York, USA},
  month     = {20--22 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v48/lina16.pdf},
  url       = {https://proceedings.mlr.press/v48/lina16.html},
  abstract  = {We study the generalization properties of stochastic gradient methods for learning with convex loss functions and linearly parameterized functions. We show that, in the absence of penalizations or constraints, the stability and approximation properties of the algorithm can be controlled by tuning either the step-size or the number of passes over the data. In this view, these parameters can be seen to control a form of implicit regularization. Numerical results complement the theoretical findings.}
}
Endnote
%0 Conference Paper
%T Generalization Properties and Implicit Regularization for Multiple Passes SGM
%A Junhong Lin
%A Raffaello Camoriano
%A Lorenzo Rosasco
%B Proceedings of The 33rd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2016
%E Maria Florina Balcan
%E Kilian Q. Weinberger
%F pmlr-v48-lina16
%I PMLR
%P 2340--2348
%U https://proceedings.mlr.press/v48/lina16.html
%V 48
%X We study the generalization properties of stochastic gradient methods for learning with convex loss functions and linearly parameterized functions. We show that, in the absence of penalizations or constraints, the stability and approximation properties of the algorithm can be controlled by tuning either the step-size or the number of passes over the data. In this view, these parameters can be seen to control a form of implicit regularization. Numerical results complement the theoretical findings.
RIS
TY - CPAPER
TI - Generalization Properties and Implicit Regularization for Multiple Passes SGM
AU - Junhong Lin
AU - Raffaello Camoriano
AU - Lorenzo Rosasco
BT - Proceedings of The 33rd International Conference on Machine Learning
DA - 2016/06/11
ED - Maria Florina Balcan
ED - Kilian Q. Weinberger
ID - pmlr-v48-lina16
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 48
SP - 2340
EP - 2348
L1 - http://proceedings.mlr.press/v48/lina16.pdf
UR - https://proceedings.mlr.press/v48/lina16.html
AB - We study the generalization properties of stochastic gradient methods for learning with convex loss functions and linearly parameterized functions. We show that, in the absence of penalizations or constraints, the stability and approximation properties of the algorithm can be controlled by tuning either the step-size or the number of passes over the data. In this view, these parameters can be seen to control a form of implicit regularization. Numerical results complement the theoretical findings.
ER -
APA
Lin, J., Camoriano, R., & Rosasco, L. (2016). Generalization Properties and Implicit Regularization for Multiple Passes SGM. Proceedings of The 33rd International Conference on Machine Learning, in Proceedings of Machine Learning Research 48:2340-2348. Available from https://proceedings.mlr.press/v48/lina16.html.