Stability of Graph Convolutional Neural Networks through the lens of small perturbation analysis

Bibliographic Details
Main Authors: Testa, Lucia; Battiloro, Claudio; Sardellitti, Stefania; Barbarossa, Sergio
Format: Journal Article
Language: English
Published: 20.12.2023

Summary: In this work, we study the stability of Graph Convolutional Neural Networks (GCNs) under random small perturbations of the underlying graph topology, i.e. under a limited number of edge insertions or deletions. We derive a novel bound on the expected difference between the outputs of the unperturbed and perturbed GCNs. The proposed bound explicitly depends on the magnitude of the perturbation of the eigenpairs of the Laplacian matrix, which in turn depends on which edges are inserted or deleted. We then provide a quantitative characterization of the effect of perturbing specific edges on the stability of the network. We leverage tools from small perturbation analysis to express the bounds in closed, albeit approximate, form, enhancing the interpretability of the results without the need to compute any perturbed shift operator. Finally, we numerically evaluate the effectiveness of the proposed bound.
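The summary refers to classical small (first-order) perturbation analysis of the Laplacian eigenpairs under a single edge deletion. The following minimal NumPy sketch (illustrative only, not the authors' code; the example graph is made up) shows the underlying idea: for a symmetric Laplacian L with eigenpairs (λ_i, u_i), deleting edge (a, b) changes L by Δ = −(e_a − e_b)(e_a − e_b)ᵀ, and first-order theory predicts λ_i′ ≈ λ_i + u_iᵀ Δ u_i, without eigendecomposing the perturbed shift operator.

```python
import numpy as np

def laplacian(n, edges):
    """Combinatorial Laplacian L = D - A of an undirected graph."""
    A = np.zeros((n, n))
    for a, b in edges:
        A[a, b] = A[b, a] = 1.0
    return np.diag(A.sum(axis=1)) - A

# Small example graph (hypothetical): a 5-cycle with one chord.
n = 5
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0), (1, 3)]

L = laplacian(n, edges)
lam, U = np.linalg.eigh(L)          # eigenpairs of the unperturbed Laplacian

# Delete edge (1, 3); the exact perturbed Laplacian, for comparison only.
a, b = 1, 3
L_pert = laplacian(n, [e for e in edges if e != (a, b)])
lam_exact = np.linalg.eigvalsh(L_pert)

# First-order approximation: lam_i' ~ lam_i + u_i^T Delta u_i,
# computed entirely from the *unperturbed* eigenpairs.
Delta = L_pert - L                   # equals -(e_a - e_b)(e_a - e_b)^T
lam_approx = lam + np.sum(U * (Delta @ U), axis=0)

print("exact  eigenvalues:", np.round(lam_exact, 3))
print("approx eigenvalues:", np.round(lam_approx, 3))
```

Note that the correction term preserves the zero eigenvalue (the constant eigenvector is orthogonal to e_a − e_b) and the trace of the perturbed Laplacian exactly; the per-eigenvalue error is first-order in the perturbation magnitude.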
DOI: 10.48550/arxiv.2312.12934