Fast Parallel Tensor Times Same Vector for Hypergraphs
Format: Journal Article
Language: English
Published: 14.11.2023
Summary: Hypergraphs are a popular paradigm for representing complex real-world networks that exhibit multi-way relationships of varying sizes. Mining centrality in hypergraphs via symmetric adjacency tensors has only recently become computationally feasible for large and complex datasets. To enable scalable computation of these and related hypergraph analytics, here we focus on the Sparse Symmetric Tensor Times Same Vector (S$^3$TTVc) operation. We introduce the Compound Compressed Sparse Symmetric (CCSS) format, an extension of the compact CSS format to hypergraphs of varying hyperedge sizes, and present a shared-memory parallel algorithm to compute S$^3$TTVc. We experimentally show that S$^3$TTVc computation using the CCSS format outperforms the naive baseline, and consequently accelerates hypergraph $H$-eigenvector centrality computation.
DOI: 10.48550/arxiv.2311.08595
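The tensor-times-same-vector operation summarized above contracts a symmetric hypergraph adjacency tensor with copies of the same vector in every mode but one, which is the kernel inside an $H$-eigenvector centrality power iteration. As a rough illustration of the operation's semantics only (not the paper's CCSS format or its parallel algorithm), a naive hyperedge-by-hyperedge evaluation might look like the sketch below; the function name, the list-of-tuples hypergraph representation, and the unit-weight normalization per hyperedge are all assumptions made for this sketch:

```python
def ttsv(hyperedges, x):
    """Naive sketch of tensor-times-same-vector for a hypergraph.

    For each vertex i, accumulates
        y[i] = sum over hyperedges e containing i of
               prod(x[j] for j in e, j != i),
    i.e., the symmetric adjacency tensor contracted with x in all
    modes but one, without ever materializing the tensor. Hyperedges
    may have varying sizes, matching the setting in the abstract.
    """
    y = [0.0] * len(x)
    for e in hyperedges:
        for i in e:
            # Product of x over the other vertices of this hyperedge.
            p = 1.0
            for j in e:
                if j != i:
                    p *= x[j]
            y[i] += p
    return y
```

This baseline recomputes the partial product for every vertex of every hyperedge; a compressed symmetric layout such as the CCSS format described in the abstract exists precisely to avoid that kind of redundant work.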