Committee polyhedral separability: complexity and polynomial approximation


Bibliographic Details
Published in: Machine Learning, Vol. 101, no. 1–3, pp. 231–251
Main Author: Khachay, Michael
Format: Journal Article
Language: English
Published: New York: Springer US, 01.10.2015 (Springer Nature B.V.)
ISSN: 0885-6125, 1573-0565
DOI: 10.1007/s10994-015-5505-0

Summary: We consider the minimum affine separating committee (MASC) combinatorial optimization problem, which is related to ensemble machine learning techniques on the class of linear weak classifiers combined by the rule of simple majority. The MASC problem is a mathematical formalization of the famous Vapnik–Chervonenkis principle of structural risk minimization for the mentioned class of classifiers. According to this principle, one is required to construct a best-performance ensemble classifier belonging to a family of the least possible VC-dimension. It is known that the MASC problem is NP-hard and remains intractable in spaces of any fixed dimension n > 1, even under the additional constraint that the separated sets be in general position. This special case of the MASC problem, called MASC-GP(n), is the main subject of the present paper. To design polynomial-time approximation algorithms for a class of combinatorial optimization problems containing the MASC problem, we propose a new framework that adjusts the well-known Multiplicative Weights Update method. Following this approach, we construct polynomial-time approximation algorithms with state-of-the-art approximation guarantees for the MASC-GP(n) problem. The results obtained provide a theoretical framework for learning high-performance ensembles of affine classifiers.
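To illustrate the notion of an affine separating committee discussed in the summary, the sketch below (not taken from the paper; the committee weights are a hypothetical hand-built example) shows a committee of three affine classifiers, combined by simple majority, separating the XOR point configuration in the plane, which no single affine classifier can separate:

```python
def committee_predict(committee, x):
    """Classify x as +1/-1 by simple majority vote of affine members (w, b).

    Each member votes +1 if w . x + b > 0, else -1; an odd committee
    size guarantees the vote is never tied.
    """
    votes = sum(1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1
                for w, b in committee)
    return 1 if votes > 0 else -1

# Hypothetical 3-member committee for the XOR configuration.
committee = [
    ((1.0, -1.0), 0.5),    # f1(x, y) =  x - y + 0.5
    ((-1.0, 1.0), 0.5),    # f2(x, y) =  y - x + 0.5
    ((-1.0, -1.0), 0.5),   # f3(x, y) = -x - y + 0.5
]

positives = [(0.0, 0.0), (1.0, 1.0)]   # class +1
negatives = [(0.0, 1.0), (1.0, 0.0)]   # class -1

assert all(committee_predict(committee, p) == 1 for p in positives)
assert all(committee_predict(committee, p) == -1 for p in negatives)
```

Here each pair of points that a single hyperplane would misclassify is "rescued" by the other two members outvoting it, which is exactly the simple-majority combination rule the MASC problem formalizes.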