Hypergraph convolution and hypergraph attention


Bibliographic Details
Published in: Pattern Recognition, Vol. 110, p. 107637
Main Authors: Bai, Song; Zhang, Feihu; Torr, Philip H.S.
Format: Journal Article
Language: English
Published: Elsevier Ltd, 01.02.2021
Summary:
• Hypergraph convolution defines a basic convolutional operator on a hypergraph. It enables efficient information propagation between vertices by fully exploiting the high-order relationships and local clustering structure therein. We mathematically prove that graph convolution is a special case of hypergraph convolution when the non-pairwise relationship degenerates to a pairwise one.
• Apart from hypergraph convolution, where the underlying structure used for propagation is pre-defined, hypergraph attention further exerts an attention mechanism to learn a dynamic connection of hyperedges. Information propagation and gathering are then performed in task-relevant parts of the graph, generating more discriminative node embeddings.
• Both hypergraph convolution and hypergraph attention are end-to-end trainable and can be inserted into most variants of graph neural networks whenever non-pairwise relationships are observed. Extensive experimental results on benchmark datasets demonstrate the efficacy of the proposed methods for semi-supervised node classification.

Recently, graph neural networks have attracted great attention and achieved prominent performance in various research fields. Most of these algorithms assume pairwise relationships between objects of interest. In many real applications, however, the relationships between objects are of higher order, beyond a pairwise formulation. To efficiently learn deep embeddings on such high-order graph-structured data, we introduce two end-to-end trainable operators to the family of graph neural networks: hypergraph convolution and hypergraph attention. While hypergraph convolution defines the basic formulation of performing convolution on a hypergraph, hypergraph attention further enhances the capacity of representation learning by leveraging an attention module.
With the two operators, a graph neural network is readily extended to a more flexible model and applied to diverse applications where non-pairwise relationships are observed. Extensive experimental results on semi-supervised node classification demonstrate the effectiveness of hypergraph convolution and hypergraph attention.
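To make the propagation rule in the summary concrete: a commonly stated form of hypergraph convolution computes X' = Dv^{-1/2} H W De^{-1} Hᵀ Dv^{-1/2} X P, where H is the vertex–hyperedge incidence matrix, W the hyperedge weights, Dv and De the vertex and hyperedge degree matrices, and P a learnable projection. The NumPy sketch below is an illustrative reconstruction from that formula, not the authors' code; the function name, shapes, and omission of the nonlinearity are assumptions.

```python
import numpy as np

def hypergraph_conv(X, H, W, P):
    """One hypergraph convolution step (sketch):
    X' = Dv^{-1/2} H W De^{-1} H^T Dv^{-1/2} X P

    X: (n, d)  node features
    H: (n, m)  incidence matrix (H[i, e] = 1 iff vertex i is in hyperedge e)
    W: (m,)    hyperedge weights (diagonal of W as a vector)
    P: (d, k)  learnable projection matrix
    """
    De = H.sum(axis=0)                 # hyperedge degrees (m,)
    Dv = H @ W                         # weighted vertex degrees (n,)
    dv_inv_sqrt = 1.0 / np.sqrt(Dv)    # Dv^{-1/2} as a vector
    # Normalized propagation matrix A (n, n); symmetric because W, De are diagonal.
    A = (dv_inv_sqrt[:, None] * H) @ np.diag(W / De) @ (H.T * dv_inv_sqrt[None, :])
    return A @ X @ P

# Tiny example: 3 vertices, 2 hyperedges sharing vertex 1.
H = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 1.0]])
W = np.ones(2)
out = hypergraph_conv(np.eye(3), H, W, np.eye(3))  # out equals A itself
```

When every hyperedge connects exactly two vertices, H Hᵀ reduces to the adjacency structure of an ordinary graph, which is the sense in which graph convolution becomes a special case.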
ISSN:0031-3203
1873-5142
DOI:10.1016/j.patcog.2020.107637