An Accelerated Second-Order Method for Distributed Stochastic Optimization

Bibliographic Details
Main Authors: Agafonov, Artem; Dvurechensky, Pavel; Scutari, Gesualdo; Gasnikov, Alexander; Kamzolov, Dmitry; Lukashevich, Aleksandr; Daneshmand, Amir
Format: Journal Article
Language: English
Published: 26.03.2021
Summary: We consider distributed stochastic optimization problems solved with a master/workers computation architecture. Statistical arguments allow us to exploit statistical similarity and approximate this problem by a finite-sum problem, for which we propose an inexact accelerated cubic-regularized Newton's method that achieves the lower communication complexity bound for this setting and improves upon the existing upper bound. We further exploit this algorithm to obtain convergence-rate bounds for the original stochastic optimization problem, and we compare our bounds with existing bounds in several regimes where the goal is to minimize the number of communication rounds and to increase parallelization by increasing the number of workers.
DOI: 10.48550/arxiv.2103.14392
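
As a rough illustration of the kind of method the summary refers to, below is a minimal Python sketch of a plain (non-accelerated, single-machine) cubic-regularized Newton iteration, with the cubic subproblem solved inexactly by gradient descent. The toy objective, step sizes, and function names are illustrative assumptions, not taken from the paper.

    import numpy as np

    def cubic_newton_step(grad, hess, M, n_inner=100, lr=0.05):
        # Approximately minimize the cubic model
        #   m(s) = <grad, s> + 0.5 s^T hess s + (M/6) ||s||^3
        # by gradient descent in s (an inexact subproblem solve).
        s = np.zeros_like(grad)
        for _ in range(n_inner):
            model_grad = grad + hess @ s + 0.5 * M * np.linalg.norm(s) * s
            s -= lr * model_grad
        return s

    def cubic_newton(f_grad, f_hess, x0, M=10.0, n_iters=30):
        # Plain cubic-regularized Newton method (Nesterov-Polyak style),
        # without the acceleration or distributed structure of the paper.
        x = x0.copy()
        for _ in range(n_iters):
            x = x + cubic_newton_step(f_grad(x), f_hess(x), M)
        return x

    # Hypothetical toy objective: f(x) = 0.5 x^T A x - b^T x + 0.25 ||x||^4.
    A = np.array([[3.0, 1.0], [1.0, 2.0]])
    b = np.array([1.0, -1.0])
    f_grad = lambda x: A @ x - b + (x @ x) * x
    f_hess = lambda x: A + 2.0 * np.outer(x, x) + (x @ x) * np.eye(2)

    x_star = cubic_newton(f_grad, f_hess, np.zeros(2))
    print("solution:", x_star, "gradient norm:", np.linalg.norm(f_grad(x_star)))

The paper's actual method additionally accelerates these steps and distributes the gradient and Hessian computations across workers, exploiting statistical similarity between local problems; the sketch above only shows the basic cubic-regularized step.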