High Probability Bounds for Stochastic Subgradient Schemes with Heavy Tailed Noise

Bibliographic Details
Main Authors: Parletta, Daniela A.; Paudice, Andrea; Pontil, Massimiliano; Salzo, Saverio
Format: Journal Article
Language: English
Published: 17.08.2022
Summary: In this work we study high probability bounds for stochastic subgradient methods under heavy-tailed noise. In this setting the noise is only assumed to have finite variance, as opposed to a sub-Gaussian distribution, for which standard subgradient methods are known to enjoy high probability bounds. We analyze a clipped version of the projected stochastic subgradient method, in which subgradient estimates are truncated whenever their norms are large. We show that this clipping strategy leads to near-optimal bounds, both any-time and at a finite horizon, for many classical averaging schemes. Preliminary experiments support the validity of the method.
DOI: 10.48550/arxiv.2208.08567
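
The summary describes the core mechanism: each stochastic subgradient estimate is truncated to a bounded norm before the projected step, and an averaged iterate is returned. Below is a minimal Python sketch of one such scheme, under stated assumptions: the names subgradient_oracle and project, the constant step size, and the fixed clipping threshold are illustrative choices, not the parameter schedules analyzed in the paper, and the uniform running average is just one of the classical averaging schemes mentioned.

import numpy as np

def clipped_projected_subgradient(x0, subgradient_oracle, project,
                                  step_size=0.1, clip_threshold=1.0,
                                  n_iters=1000):
    # subgradient_oracle(x): noisy subgradient estimate at x (possibly
    # heavy-tailed, finite variance); project(x): projection onto the
    # feasible convex set. Both are hypothetical user-supplied callables.
    x = np.asarray(x0, dtype=float)
    avg = np.zeros_like(x)
    for t in range(1, n_iters + 1):
        g = subgradient_oracle(x)
        norm = np.linalg.norm(g)
        if norm > clip_threshold:
            # Truncate estimates with large norms, guarding the update
            # against heavy-tailed noise spikes.
            g = g * (clip_threshold / norm)
        x = project(x - step_size * g)
        avg += (x - avg) / t  # uniform running average of the iterates
    return avg

# Toy usage (synthetic data): minimize ||x - x_star||_1 over the unit ball,
# with Student-t noise (finite variance, heavy tails) on the subgradients.
rng = np.random.default_rng(0)
x_star = np.array([0.3, -0.2, 0.1])
oracle = lambda x: np.sign(x - x_star) + rng.standard_t(df=2.5, size=3)
project = lambda x: x / max(1.0, np.linalg.norm(x))
x_hat = clipped_projected_subgradient(np.zeros(3), oracle, project)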