Set2Graph: Learning Graphs From Sets

Bibliographic Details
Published in: arXiv.org
Main Authors: Hadar Serviansky, Nimrod Segol, Jonathan Shlomi, Kyle Cranmer, Eilam Gross, Haggai Maron, Yaron Lipman
Format: Paper
Language: English
Published: Ithaca: Cornell University Library, arXiv.org, 26.11.2020
More Information
Summary: Many problems in machine learning can be cast as learning functions from sets to graphs, or more generally to hypergraphs; in short, Set2Graph functions. Examples include clustering, learning vertex and edge features on graphs, and learning features on triplets in a collection. A natural approach for building Set2Graph models is to characterize all linear equivariant set-to-hypergraph layers and stack them with non-linear activations. This poses two challenges: (i) the expressive power of these networks is not well understood; and (ii) these models would suffer from high, often intractable, computational and memory complexity, as their dimension grows exponentially. This paper advocates a family of neural network models for learning Set2Graph functions that is both practical and of maximal expressive power (universal), that is, able to approximate arbitrary continuous Set2Graph functions over compact sets. Testing these models on different machine learning tasks, mainly an application to particle physics, we find that they compare favorably to existing baselines.
ISSN: 2331-8422
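
To make the set-to-graph setting in the summary concrete, the following is a minimal, hypothetical PyTorch sketch (not the paper's architecture): a permutation-equivariant per-element encoder with a DeepSets-style global context, followed by a pairwise MLP that outputs an edge score for every ordered pair of set elements. The class name, layer sizes, and aggregation are illustrative choices only.

```python
import torch
import torch.nn as nn

class SimpleSet2Graph(nn.Module):
    """Toy set-to-graph model: equivariant set encoder + pairwise edge head."""

    def __init__(self, in_dim, hidden_dim=64):
        super().__init__()
        # Per-element encoder (applied identically to every set element).
        self.phi = nn.Sequential(
            nn.Linear(in_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim),
        )
        # Edge head: scores the concatenated features of an ordered pair (i, j).
        self.edge_mlp = nn.Sequential(
            nn.Linear(2 * hidden_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, 1),
        )

    def forward(self, x):
        # x: (batch, n, in_dim) -- an input set of n elements.
        h = self.phi(x)                              # (batch, n, hidden)
        h = h + h.mean(dim=1, keepdim=True)          # mix in global set context
        n = h.size(1)
        hi = h.unsqueeze(2).expand(-1, -1, n, -1)    # element i features
        hj = h.unsqueeze(1).expand(-1, n, -1, -1)    # element j features
        pair = torch.cat([hi, hj], dim=-1)           # (batch, n, n, 2*hidden)
        return self.edge_mlp(pair).squeeze(-1)       # (batch, n, n) edge scores


# Usage: two sets of 5 three-dimensional elements -> 5x5 edge-score matrices.
model = SimpleSet2Graph(in_dim=3)
scores = model(torch.randn(2, 5, 3))
print(scores.shape)  # torch.Size([2, 5, 5])
```

This sketch only illustrates the input/output structure of a Set2Graph function (a set in, a dense edge-score matrix out); the paper's models are built from the characterized equivariant set-to-hypergraph layers rather than this simple pairwise construction.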