A Multi-class Support Vector Machine Based on Geometric Margin Maximization
| Published in | Integrated Uncertainty in Knowledge Modelling and Decision Making, pp. 101–113 |
|---|---|
| Format | Book Chapter |
| Language | English |
| Published | Cham: Springer International Publishing |
| Series | Lecture Notes in Computer Science |
| Summary | Support vector machines (SVMs) are popular supervised learning methods. The original SVM was developed for binary classification: it selects a linear classifier by maximizing the geometric margin between the separating hyperplane and the training examples. Several extensions of the SVM to multi-class classification exist; however, they do not maximize geometric margins exactly. Recently, Tatsumi and Tanino proposed a multi-objective multi-class SVM, which simultaneously maximizes the margins for all class pairs. In this paper, we propose another multi-class SVM based on geometric margin maximization. The SVM is formulated as minimization of the sum of inverse-squared margins over all class pairs. Since this is a nonconvex optimization problem, we propose an approximate solution method. Numerical experiments show that the proposed SVM has better generalization capability than one of the conventional multi-class SVMs. |
|---|---|
| ISBN | 9783319754284; 3319754289 |
| ISSN | 0302-9743; 1611-3349 |
| DOI | 10.1007/978-3-319-75429-1_9 |
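The summary describes the objective only in words. A minimal NumPy sketch of what "geometric margin of a class pair" and "sum of inverse-squared margins" could mean for a linear multi-class classifier with one weight vector per class: all function names, the data layout, and the toy values below are illustrative assumptions, not taken from the chapter itself.

```python
import numpy as np

def pairwise_geometric_margins(W, b, X, y):
    """For a linear multi-class classifier with one (w_k, b_k) per class k,
    the geometric margin of class pair (p, q) is the smallest distance of
    any sample of class p or q to the pairwise decision boundary
    (w_p - w_q) . x + (b_p - b_q) = 0."""
    classes = np.unique(y)
    margins = {}
    for i, p in enumerate(classes):
        for q in classes[i + 1:]:
            w = W[p] - W[q]              # normal of the (p, q) boundary
            beta = b[p] - b[q]
            mask = (y == p) | (y == q)   # only samples of the two classes
            dists = np.abs(X[mask] @ w + beta) / np.linalg.norm(w)
            margins[(p, q)] = dists.min()
    return margins

def inverse_squared_margin_objective(margins):
    # Sum of 1 / gamma_pq^2 over all class pairs; minimizing this value
    # pushes every pairwise geometric margin to be large simultaneously.
    return sum(1.0 / g**2 for g in margins.values())
```

On a toy three-class problem in the plane (`W` of shape `(3, 2)`, one row per class), `pairwise_geometric_margins` returns one margin per unordered class pair, and the objective aggregates them into the single scalar that the chapter's formulation minimizes.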