Optimization over time-varying directed graphs with row and column-stochastic matrices

Bibliographic Details
Main Authors: Saadatniaki, Fakhteh; Xin, Ran; Khan, Usman A.
Format: Journal Article
Language: English
Published: 17.10.2018

Summary: In this paper, we provide a distributed optimization algorithm, termed TV-$\mathcal{AB}$, that minimizes a sum of convex functions over time-varying, random directed graphs. In contrast to existing work, the proposed algorithm does not require estimation of the (non-$\mathbf{1}$) Perron eigenvector of a stochastic matrix. Instead, it relies on a novel information-mixing approach that exploits both row- and column-stochastic weights to achieve agreement on the optimal solution when the underlying graph is directed. We show that TV-$\mathcal{AB}$ converges linearly to the optimal solution when the global objective is smooth and strongly convex and the underlying time-varying graphs exhibit bounded connectivity, i.e., the union of every $C$ consecutive graphs is strongly connected. We derive the convergence results from the stability analysis of a linear system of inequalities together with a matrix perturbation argument. Simulations confirm the findings of this paper.
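The row-/column-stochastic mixing idea the abstract describes can be sketched numerically. The toy below is an assumption-laden illustration, not the paper's exact TV-$\mathcal{AB}$ method: each of $n$ agents holds a quadratic objective $f_i(x) = \tfrac{1}{2}(x - b_i)^2$ (so the global minimizer is the mean of the $b_i$), the network cycles through two strongly connected digraphs, and each step combines a row-stochastic consensus update on the estimates with a column-stochastic gradient-tracking update. The problem sizes, step size, and graph sequence are all illustrative choices.

```python
import numpy as np

# Assumed toy setup (not from the paper): n = 3 agents, scalar decision
# variable, local objectives f_i(x) = 0.5 * (x - b_i)^2, so the global
# minimizer of sum_i f_i is the mean of b.
n = 3
b = np.array([1.0, 4.0, 7.0])
x_star = b.mean()

def grad(x):
    # Stacked local gradients: grad f_i(x_i) = x_i - b_i.
    return x - b

# Two directed graphs (adjacency matrices with self-loops) that the
# algorithm cycles through; each is strongly connected, so bounded
# connectivity holds trivially for any window length C >= 1.
adjs = [
    np.array([[1, 0, 1],
              [1, 1, 0],
              [0, 1, 1]], dtype=float),  # directed 3-cycle
    np.array([[1, 1, 0],
              [0, 1, 1],
              [1, 0, 1]], dtype=float),  # reversed 3-cycle
]

def row_stochastic(adj):
    # Each agent averages over its in-neighbors (rows sum to 1).
    return adj / adj.sum(axis=1, keepdims=True)

def col_stochastic(adj):
    # Each agent splits its mass over its out-neighbors (columns sum to 1).
    return adj / adj.sum(axis=0, keepdims=True)

alpha = 0.1              # illustrative constant step size
x = np.zeros(n)          # local estimates of the minimizer
y = grad(x)              # gradient trackers, initialized at local gradients

for k in range(300):
    adj = adjs[k % len(adjs)]
    A, B = row_stochastic(adj), col_stochastic(adj)
    x_new = A @ x - alpha * y             # row-stochastic mixing + descent
    y = B @ y + grad(x_new) - grad(x)     # column-stochastic gradient tracking
    x = x_new

print(np.max(np.abs(x - x_star)))  # all agents close to the global minimizer
```

Note the division of labor: the row-stochastic matrix drives the estimates toward consensus, while the column-stochastic matrix preserves the sum of the trackers, so $\sum_i y_i$ always equals the sum of the current local gradients; neither matrix needs to be doubly stochastic, which is what makes the scheme suitable for directed graphs.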
DOI:10.48550/arxiv.1810.07393