High Probability Bounds for Stochastic Subgradient Schemes with Heavy Tailed Noise
Main Authors | , , , |
---|---|
Format | Journal Article |
Language | English |
Published | 17.08.2022 |
Summary: | In this work we study high probability bounds for stochastic subgradient
methods under heavy-tailed noise. In this setting the noise is only assumed to
have finite variance, as opposed to a sub-Gaussian distribution, for which it is
known that standard subgradient methods enjoy high probability bounds. We
analyze a clipped version of the projected stochastic subgradient method,
where subgradient estimates are truncated whenever they have large norms. We
show that this clipping strategy leads to near-optimal any-time and finite-horizon
bounds for many classical averaging schemes. Preliminary experiments
support the validity of the method. |
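The clipping-and-projection step described in the summary can be sketched as follows. This is a minimal illustration only, not the paper's exact scheme: the function name, the Euclidean-ball constraint set, and all parameter choices are hypothetical assumptions made for this sketch.

```python
import numpy as np

def clipped_projected_subgradient_step(x, subgrad, step_size, clip_level, radius):
    """One illustrative step of a clipped projected stochastic subgradient
    method: clip the noisy subgradient, take a step, project back onto a
    Euclidean ball of the given radius (an example constraint set)."""
    # Clip: truncate the stochastic subgradient whenever its norm is large.
    norm = np.linalg.norm(subgrad)
    if norm > clip_level:
        subgrad = subgrad * (clip_level / norm)
    # Subgradient step.
    x = x - step_size * subgrad
    # Project onto the ball {x : ||x|| <= radius}.
    x_norm = np.linalg.norm(x)
    if x_norm > radius:
        x = x * (radius / x_norm)
    return x
```

The clipping rescales (rather than discards) heavy-tailed subgradient estimates, which is what yields bounded-norm updates while preserving the update direction.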
DOI: | 10.48550/arxiv.2208.08567 |