An Accelerated Second-Order Method for Distributed Stochastic Optimization
Main Authors | |
---|---|
Format | Journal Article |
Language | English |
Published | 26.03.2021 |
Summary: | We consider distributed stochastic optimization problems that are solved using a master/worker computation architecture. Statistical arguments allow us to exploit statistical similarity and to approximate this problem by a finite-sum problem, for which we propose an inexact accelerated cubic-regularized Newton method that achieves the lower communication-complexity bound for this setting and improves upon the existing upper bound. We further use this algorithm to obtain convergence-rate bounds for the original stochastic optimization problem, and we compare our bounds with existing bounds in several regimes in which the goal is to minimize the number of communication rounds and to increase parallelization by increasing the number of workers. |
---|---|
DOI: | 10.48550/arxiv.2103.14392 |
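The abstract's core primitive, a cubic-regularized Newton step, is a standard construction (Nesterov–Polyak) and can be sketched independently of the paper's distributed, inexact, accelerated machinery. The NumPy sketch below is not the authors' method: it solves the subproblem min_h ⟨g, h⟩ + (1/2)⟨H h, h⟩ + (M/6)‖h‖³ exactly, by bisection on r = ‖h‖ using the stationarity condition h(r) = −(H + (M r/2) I)⁻¹ g, and it assumes H is positive semidefinite. The names `cubic_newton_step`, the constant `M`, and the toy objective are illustrative choices, not taken from the paper.

```python
import numpy as np

def cubic_newton_step(grad, hess, M, tol=1e-10, max_iter=200):
    """Solve  min_h  <g, h> + 0.5 <H h, h> + (M/6) ||h||^3
    by bisection on r = ||h||, using the stationarity condition
    h(r) = -(H + (M r / 2) I)^{-1} g.  Since ||h(r)|| is decreasing
    in r, phi(r) = ||h(r)|| - r has a unique root (H assumed PSD)."""
    I = np.eye(grad.shape[0])

    def h_of(r):
        return -np.linalg.solve(hess + 0.5 * M * r * I, grad)

    # Bracket the root: ||h(r)|| ~ 2||g|| / (M r) -> 0 as r grows.
    lo, hi = 0.0, 1.0
    while np.linalg.norm(h_of(hi)) > hi:
        hi *= 2.0
    for _ in range(max_iter):
        mid = 0.5 * (lo + hi)
        if np.linalg.norm(h_of(mid)) > mid:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    return h_of(hi)

# Toy usage: f(x) = 0.25 ||x||^4 + <b, x> is convex, with gradient
# (x'x) x + b and positive-semidefinite Hessian 2 x x' + (x'x) I,
# so plain (unaccelerated) cubic-Newton iterations apply.
# M = 6.0 is a heuristic guess at the Hessian-Lipschitz constant.
rng = np.random.default_rng(0)
b = rng.standard_normal(5)
x = np.zeros(5)
for _ in range(30):
    g = (x @ x) * x + b
    H = 2.0 * np.outer(x, x) + (x @ x) * np.eye(5)
    x = x + cubic_newton_step(g, H, M=6.0)
print(np.linalg.norm((x @ x) * x + b))  # gradient norm: should be ~0
```

In the paper's setting, this exact subproblem solve would instead be performed inexactly at the master node, with the Hessian formed from statistically similar local data, and the whole iteration wrapped in an acceleration scheme; none of that machinery is reproduced in this sketch.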