A Bayesian graph structure inference neural network based on adaptive connection sampling
Published in | Applied Soft Computing, Vol. 175, p. 113018 |
---|---|
Format | Journal Article |
Language | English |
Published | Elsevier B.V., 01.05.2025 |
Summary: | Graph Neural Networks (GNNs) have drawn considerable interest recently and excel in several areas, including node classification, recommender systems, and link prediction. However, most GNNs assume by default that the observed graph accurately reflects the relationships between nodes. Because a GNN aggregates features from each node's neighbors, the observed graph is not always compatible with this aggregation mechanism. Unlike random regularization techniques that employ constant sampling rates or tune them manually as model hyperparameters, this study proposes a graph structure learning network based on adaptive connection sampling. The core idea is to use the features produced by each GNN layer under adaptive sampling to generate a graph via a Bayesian method, and to jointly optimize the graph structure and the adaptive connection sampling through iteration. Experiments are conducted on benchmark datasets to evaluate the effectiveness of this method. In the node classification task, the model improves performance by about 3.8% over the average of many baselines, indicating that learning graph structures is effective and that the inferred graphs are reasonable.
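The adaptive connection sampling described in the summary can be pictured as a per-edge stochastic mask applied inside a GNN aggregation step, in place of a single global dropout rate. The sketch below is illustrative only and is not the paper's ASBGIN implementation: the function name, the mean-aggregation rule, and the fixed `keep_prob` values (which the paper would instead learn and adapt per layer) are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def adaptive_connection_sampling(adj, features, keep_prob, training=True):
    """One GNN aggregation step with a per-edge stochastic mask.

    adj       : (n, n) binary adjacency matrix
    features  : (n, d) node feature matrix
    keep_prob : (n, n) per-edge keep probabilities -- learned in the
                paper's setting; fixed constants here for illustration
    """
    adj = adj.astype(float)
    if training:
        # Sample an independent Bernoulli mask per edge, rather than
        # dropping edges at one constant, hand-tuned rate.
        mask = rng.random(adj.shape) < keep_prob
        adj = adj * mask
    # Mean aggregation over the sampled neighbors of each node.
    deg = adj.sum(axis=1, keepdims=True)
    deg[deg == 0] = 1.0  # avoid division by zero for isolated nodes
    return (adj @ features) / deg

# Toy example: 5 nodes, 3-dimensional features, random sparse graph.
n, d = 5, 3
adj = (rng.random((n, n)) < 0.6).astype(float)
np.fill_diagonal(adj, 0)
x = rng.standard_normal((n, d))
keep = np.full((n, n), 0.8)  # placeholder for learned probabilities
h = adaptive_connection_sampling(adj, x, keep)
print(h.shape)
```

In the paper's joint scheme, the aggregated features from each layer would then feed the Bayesian graph-generation step, and the sampled graph and keep probabilities would be refined iteratively; the sketch covers only the sampling-and-aggregation step.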
•ASBGIN generates graph structures that conform to graph neural network aggregation.
•ASBGIN considers high-order neighborhood features when inferring the graph structure.
•ASBGIN mitigates GNN over-smoothing and overfitting without a fixed dropout rate.
ISSN: 1568-4946
DOI: 10.1016/j.asoc.2025.113018