Near-Tight Margin-Based Generalization Bounds for Support Vector Machines
Format: | Journal Article |
Language: | English |
Published: | 03.06.2020 |
Summary: | Support Vector Machines (SVMs) are among the most fundamental tools for
binary classification. In its simplest formulation, an SVM produces a
hyperplane separating two classes of data using the largest possible margin to
the data. The focus on maximizing the margin has been well motivated through
numerous generalization bounds. In this paper, we revisit and improve the
classic generalization bounds in terms of margins. Furthermore, we complement
our new generalization bound by a nearly matching lower bound, thus almost
settling the generalization performance of SVMs in terms of margins. |
DOI: | 10.48550/arxiv.2006.02175 |
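For readers unfamiliar with the margin formulation referenced in the abstract, the following is a minimal illustrative sketch, not taken from the paper: it fits a hard-margin linear SVM on synthetic linearly separable data and reads off the geometric margin 1/||w|| of the resulting hyperplane. The synthetic dataset, the use of scikit-learn, and the large-C approximation of the hard-margin problem are all assumptions made purely for illustration.

```python
# Illustrative sketch (not from the paper): a hard-margin linear SVM on
# separable 2-D data, showing the "largest possible margin" idea.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Two well-separated Gaussian point clouds, so the data is linearly separable.
X = np.vstack([rng.normal(loc=[-2.0, -2.0], scale=0.5, size=(50, 2)),
               rng.normal(loc=[2.0, 2.0], scale=0.5, size=(50, 2))])
y = np.array([-1] * 50 + [1] * 50)

# A very large C approximates the hard-margin SVM objective.
clf = SVC(kernel="linear", C=1e6).fit(X, y)

w = clf.coef_[0]
# Geometric margin of the separating hyperplane: 1 / ||w||.
margin = 1.0 / np.linalg.norm(w)
print(f"geometric margin = {margin:.3f}, support vectors = {clf.support_.size}")
```

Classic margin-based generalization bounds of the kind the paper revisits relate the test error of such a classifier to the empirical margin error, the data radius, and the margin achieved, rather than to the ambient dimension.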