A Multi-class Support Vector Machine Based on Geometric Margin Maximization


Bibliographic Details
Published in: Integrated Uncertainty in Knowledge Modelling and Decision Making, pp. 101-113
Main Authors: Yoshifumi Kusunoki, Keiji Tatsumi
Format: Book Chapter
Language: English
Published: Cham: Springer International Publishing
Series: Lecture Notes in Computer Science
Summary: Support vector machines (SVMs) are popular supervised learning methods. The original SVM was developed for binary classification: it selects a linear classifier by maximizing the geometric margin between the boundary hyperplane and the training examples. Several extensions of the SVM to multi-class classification exist; however, they do not maximize geometric margins exactly. Recently, Tatsumi and Tanino proposed a multi-objective multi-class SVM, which simultaneously maximizes the margins for all class pairs. In this paper, we propose another multi-class SVM based on geometric margin maximization. The SVM is formulated as minimization of the sum of inverse-squared margins over all class pairs. Since this is a nonconvex optimization problem, we propose an approximate solution method. Numerical experiments show that the proposed SVM has better generalization performance than one of the conventional multi-class SVMs.
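The objective described in the summary can be sketched as follows. The notation here (per-class weight vectors $w_p$ and biases $b_p$, pairwise geometric margin $\gamma_{pq}$) is an assumption for illustration; the chapter's actual formulation may differ in details:

```latex
% Sketch, assuming an all-together linear multi-class SVM with one
% affine function w_p^T x + b_p per class p = 1, ..., k.
% The geometric margin between classes p and q is the smallest distance
% from their examples to the pairwise separating hyperplane:
\[
  \gamma_{pq}(w, b)
  = \min_{i \,:\, y_i \in \{p, q\}}
    \frac{\bigl|(w_p - w_q)^{\top} x_i + (b_p - b_q)\bigr|}
         {\|w_p - w_q\|},
\]
% and the proposed objective minimizes the sum of inverse-squared
% margins over all class pairs, subject to correct pairwise separation:
\[
  \min_{w,\, b} \;\; \sum_{1 \le p < q \le k} \frac{1}{\gamma_{pq}(w, b)^{2}}.
\]
```

Unlike the binary case, rescaling $(w, b)$ cannot normalize all $k(k-1)/2$ pairwise margins to a common value at once, so the denominators cannot all be fixed simultaneously; this is one way to see why the problem is nonconvex, as the summary notes.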
ISBN: 9783319754284; 3319754289
ISSN: 0302-9743; 1611-3349
DOI: 10.1007/978-3-319-75429-1_9