Deep Submodular Peripteral Networks
| Main Authors | |
| --- | --- |
| Format | Journal Article |
| Language | English |
| Published | 12.03.2024 |
Summary: Submodular functions, crucial for various applications, often lack practical learning methods for their acquisition. Seemingly unrelated, learning a scaling from oracles offering graded pairwise preferences (GPC) is underexplored, despite a rich history in psychometrics. In this paper, we introduce deep submodular peripteral networks (DSPNs), a novel parametric family of submodular functions, and methods for their training using a contrastive-learning-inspired, GPC-ready strategy to connect and then tackle both of the above challenges. We introduce a newly devised GPC-style "peripteral" loss which leverages numerically graded relationships between pairs of objects (sets in our case). Unlike traditional contrastive learning, our method utilizes graded comparisons, extracting more nuanced information than just binary-outcome comparisons, and contrasts sets of any size (not just two). We also define a novel suite of automatic sampling strategies for training, including active-learning-inspired submodular feedback. We demonstrate DSPNs' efficacy in learning submodularity from a costly target submodular function, showing superiority in downstream tasks such as experimental design and streaming applications.
DOI: 10.48550/arxiv.2403.08199
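The central object in the abstract is the submodular function, whose defining property is diminishing returns: adding an item to a smaller set yields at least as large a marginal gain as adding it to a superset. The snippet below is a minimal, self-contained sketch of that property using a toy weighted-coverage function; it is not the paper's DSPN architecture or peripteral loss, and the data and names in it are hypothetical.

```python
# Minimal sketch (toy example, not from the paper): a weighted coverage
# function f(S) = total weight of ground-set elements covered by the items in S
# is submodular, i.e. it satisfies diminishing returns:
#   f(A ∪ {v}) - f(A) >= f(B ∪ {v}) - f(B)   whenever A ⊆ B.

def coverage(covers, weights, S):
    """Weighted coverage of the item set S."""
    covered = set().union(*(covers[i] for i in S)) if S else set()
    return sum(weights[e] for e in covered)

# Hypothetical data: each item covers a few elements of a weighted ground set.
covers = {
    0: {"a", "b"},
    1: {"b", "c"},
    2: {"c", "d", "e"},
}
weights = {"a": 1.0, "b": 2.0, "c": 1.5, "d": 0.5, "e": 1.0}

A = {0}        # smaller context
B = {0, 1}     # larger context, A ⊆ B
v = 2          # candidate item to add

gain_A = coverage(covers, weights, A | {v}) - coverage(covers, weights, A)
gain_B = coverage(covers, weights, B | {v}) - coverage(covers, weights, B)

print(f"gain given A: {gain_A}, gain given B: {gain_B}")  # 3.0 vs. 1.5
assert gain_A >= gain_B  # diminishing returns: the gain shrinks as the set grows
```

Running the sketch prints a marginal gain of 3.0 for adding item 2 to the smaller set A versus 1.5 for the larger set B, so the diminishing-returns inequality holds on this toy instance.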