Optimizing working sets for training support vector regressors by Newton's method
Published in: 2015 International Joint Conference on Neural Networks (IJCNN), pp. 1-8
Format: Conference Proceeding; Journal Article
Language: English
Published: IEEE, 01.07.2015
Summary: In this paper, we train support vector regressors (SVRs) by fusing sequential minimal optimization (SMO) and Newton's method. We use the SVR formulation that includes absolute-value variables. The partial derivative of an absolute-value variable with respect to its associated variable is indefinite when the variable is zero. We determine the derivative value according to whether the optimal solution lies in the positive region, in the negative region, or at zero. In selecting the working set, we use the method we developed for the SVM: in addition to the pair of variables selected by SMO, loop variables, which appear repeatedly during training, are added to the working set. With this method, the working set size is determined automatically. We demonstrate the validity of our method over SMO on several benchmark data sets.
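To make the two technical points of the abstract concrete, here is a minimal Python sketch. The function names, the loop-variable counting heuristic, the `min_repeats` threshold, and the `region_at_optimum` flag are illustrative assumptions, not the authors' implementation:

```python
from collections import Counter

def abs_subderivative(x, region_at_optimum="zero"):
    """d|x|/dx is +1 for x > 0 and -1 for x < 0; at x == 0 it is
    indefinite, so choose the value according to where the optimum
    is known to lie (a hypothetical flag standing in for the test
    described in the abstract)."""
    if x > 0:
        return 1.0
    if x < 0:
        return -1.0
    return {"positive": 1.0, "negative": -1.0, "zero": 0.0}[region_at_optimum]

def select_working_set(smo_pair, history, min_repeats=3):
    """Working set = the pair chosen by SMO plus 'loop variables':
    indices that have reappeared in earlier working sets at least
    min_repeats times. The working-set size thus grows automatically."""
    counts = Counter(i for ws in history for i in ws)
    loop_vars = {i for i, c in counts.items() if c >= min_repeats}
    working_set = set(smo_pair) | loop_vars
    history.append(working_set)
    return sorted(working_set)

# Usage: variable 3 keeps reappearing, so it is eventually promoted
# into the working set even when SMO no longer selects it.
history = []
for pair in [(3, 7), (3, 9), (3, 5), (8, 9)]:
    print(select_working_set(pair, history))
```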
ISSN: 2161-4393, 2161-4407
DOI: 10.1109/IJCNN.2015.7280309