Generalization properties and implicit regularization for multiple passes SGM / Lin, J.; Camoriano, R.; Rosasco, L. - ELECTRONIC. - 48:(2016), pp. 2340-2348. (Paper presented at the 33rd International Conference on Machine Learning, held in New York, USA, June 19-24, 2016).

Generalization properties and implicit regularization for multiple passes SGM

Camoriano, R.
2016

Abstract

We study the generalization properties of stochastic gradient methods for learning with convex loss functions and linearly parameterized functions. We show that, in the absence of penalizations or constraints, the stability and approximation properties of the algorithm can be controlled by tuning either the step-size or the number of passes over the data. In this view, these parameters can be seen to control a form of implicit regularization. Numerical results complement the theoretical findings.
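The abstract's central point is that plain SGD with no penalty or constraint is still regularized implicitly: a small step-size or an early stop (few passes) keeps the iterate close to its initialization, while many passes let it fit the data fully. A minimal sketch of this effect on least squares with a linear model (the data, step-size, and pass counts below are illustrative choices, not taken from the paper):

```python
import numpy as np

def sgd_multipass(X, y, step_size, n_passes, seed=0):
    """Plain SGD for squared loss: no penalty, no projection.
    Regularization is implicit in (step_size, n_passes)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_passes):
        # one pass = one epoch over the shuffled training set
        for i in rng.permutation(n):
            grad = (X[i] @ w - y[i]) * X[i]  # pointwise squared-loss gradient
            w -= step_size * grad
    return w

# Synthetic noisy linear data (illustrative only)
rng = np.random.default_rng(1)
n, d = 200, 10
w_true = rng.normal(size=d)
X = rng.normal(size=(n, d))
y = X @ w_true + 0.5 * rng.normal(size=n)

# Few passes -> iterate stays near 0 (stronger implicit regularization);
# many passes -> iterate approaches the unpenalized least-squares fit.
w_early = sgd_multipass(X, y, step_size=0.001, n_passes=1)
w_late = sgd_multipass(X, y, step_size=0.001, n_passes=100)
print(np.linalg.norm(w_early), np.linalg.norm(w_late))
```

With a fixed small step-size, the norm of the early-stopped iterate is much smaller than that of the many-pass iterate, mimicking what an explicit penalty would do; tuning either the step-size or the number of passes trades stability against fit, as the abstract describes.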
File in this record:
lina16.pdf
  Access: restricted
  Type: 2a Post-print editorial version / Version of Record
  License: Non-public - Private/restricted access
  Size: 378.13 kB
  Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11583/2982145