Tightening big Ms in integer programming formulations for support vector machines with ramp loss

Bibliographic Details
Published in: European Journal of Operational Research, Vol. 286, No. 1, pp. 84-100
Main Authors: Baldomero-Naranjo, Marta; Martínez-Merino, Luisa I.; Rodríguez-Chía, Antonio M.
Format: Journal Article
Language: English
Published: Elsevier B.V., 01.10.2020
Summary:
•Support vector machines with ramp loss models are analyzed.
•Strategies for tightening the values of the big M parameters of the model are proposed.
•The strategies focus on the ℓ1-norm and ℓ2-norm cases.
•Linear, quadratic, and Lagrangian relaxation models are solved in the strategies.
•The performance of these strategies is tested on simulated and real-life datasets.

This paper considers various models of support vector machines with ramp loss, an efficient and robust tool in supervised classification for the detection of outliers. Exact solution approaches for the resulting optimization problem are in high demand for large datasets. Hence, the goal of this paper is to develop algorithms that exactly and efficiently solve these optimization problems. The approaches are based on three strategies for obtaining tightened values of the big M parameters included in the formulation of the problem. Two of them require solving a sequence of continuous problems, while the third uses a Lagrangian relaxation to tighten the bounds. The proposed resolution methods are valid for both the ℓ1-norm and ℓ2-norm ramp loss formulations. They were tested and compared with existing solution methods on simulated and real-life datasets, demonstrating the efficiency of the developed methodology.
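The big M constants arise because the ramp loss introduces binary variables marking which samples land on the flat part of the loss; each M must dominate that sample's worst-case margin violation, and looser Ms weaken the continuous relaxation. As a minimal, hedged illustration (Python/NumPy; the function names and the box bounds `w_box`/`b_box` are assumptions for this sketch, not the paper's notation), here is the ramp loss itself together with a naive interval-arithmetic big M of the kind the paper's relaxation-based strategies would tighten:

```python
import numpy as np

def ramp_loss(margin):
    """Ramp loss: the hinge loss max(0, 1 - margin) clipped at 2, so a
    misclassified point far from the boundary (an outlier) incurs only a
    bounded penalty."""
    return np.minimum(2.0, np.maximum(0.0, 1.0 - margin))

def naive_big_m(X, w_box, b_box):
    """A trivially valid but loose per-sample big M: an upper bound on
    1 - y_i * (w . x_i + b) over the box |w_j| <= w_box, |b| <= b_box.
    The box bounds are an assumption of this sketch; the paper instead
    tightens such Ms by solving LP/QP or Lagrangian relaxations."""
    return 1.0 + np.abs(X).sum(axis=1) * w_box + b_box
```

A smaller valid M for each sample shrinks the feasible region of the continuous relaxation without cutting off any integer solution, which is why branch-and-bound benefits from the tightening strategies the abstract describes.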
ISSN: 0377-2217
EISSN: 1872-6860
DOI: 10.1016/j.ejor.2020.03.023