Conic Relaxations for Semi-supervised Support Vector Machines

Bibliographic Details
Published in: Journal of Optimization Theory and Applications, Vol. 169, No. 1, pp. 299-313
Main Authors: Bai, Yanqin; Yan, Xin
Format: Journal Article
Language: English
Published: New York: Springer US, 01.04.2016 (Springer Nature B.V.)

Summary: Semi-supervised support vector machines arise in machine learning as a mixed integer programming model for classification. In this paper, we propose two convex conic relaxations for the original mixed integer programming problem. The first is a new semi-definite relaxation, and the possible maximal ratio of its optimal value to that of the original problem is estimated approximately. The second is a doubly nonnegative relaxation, obtained by relaxing a well-known conic program, the completely positive programming reformulation that is equivalent to the original problem. Furthermore, we prove that the doubly nonnegative relaxation is tighter than the semi-definite relaxation. Finally, numerical results show that the two proposed relaxations not only generate proper classifiers but also outperform some existing methods in classification accuracy.
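For orientation, the sketch below gives a standard semi-supervised SVM mixed integer model of the kind the summary refers to, together with the generic lifting step behind a semidefinite relaxation. The notation (the first ℓ points labeled, penalty parameters C and C*, lifted matrix Y) is assumed here for illustration and need not match the paper's exact formulation.

% Illustrative sketch only: a standard hinge-loss S3VM mixed integer model;
% variable names and constants are assumptions, not the paper's notation.
\[
\min_{w,\;b,\;y_j \in \{-1,+1\}}\;
  \tfrac{1}{2}\|w\|^{2}
  + C \sum_{i=1}^{\ell} \max\!\bigl(0,\,1 - y_i\,(w^{\top}x_i + b)\bigr)
  + C^{*} \sum_{j=\ell+1}^{n} \max\!\bigl(0,\,1 - y_j\,(w^{\top}x_j + b)\bigr),
\]
% where the first \ell points carry known labels and the remaining labels y_j
% are binary decision variables, which makes the problem a mixed integer
% program.  A semidefinite relaxation lifts the label vector y into a matrix
% variable Y \approx y y^\top and keeps only the convex constraints:
\[
Y \succeq 0, \qquad \operatorname{diag}(Y) = e, \qquad
\text{rank-one condition } Y = y y^{\top} \text{ dropped.}
\]
% A doubly nonnegative relaxation instead starts from an equivalent completely
% positive reformulation and replaces the completely positive cone by the cone
% of matrices that are both positive semidefinite and entrywise nonnegative
% (the relaxation the summary reports to be tighter).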
ISSN: 0022-3239
eISSN: 1573-2878
DOI: 10.1007/s10957-015-0843-4