Tree oblique for regression with weighted support vector machine

Bibliographic Details
Published in: Computational Statistics
Main Authors: Carta, Andrea; Frigau, Luca
Format: Journal Article
Language: English
Published: 10.07.2025
ISSN: 0943-4062, 1613-9658
DOI: 10.1007/s00180-025-01647-w

Summary: This work presents a new approach to learning oblique decision trees for regression tasks. Oblique decision trees are a type of supervised statistical learning technique in which a linear combination of a set of predictors is used to find the hyperplane that partitions the feature space at each node. Our novel algorithm, called Tree Oblique for Regression with weighted Support vector machine (TORS), at each node first applies a feature selection method based on the predictors' correlation with the dependent variable, then dichotomizes the continuous dependent variable and applies a weighted support vector machine classifier with a linear kernel to discover the oblique hyperplane that minimizes the deviance. We evaluate the performance of TORS on several types of simulated data and find that it performs well on all of them. Moreover, we assess its predictive power, measured by root mean squared error, against that of other oblique tree models and a standard decision tree, using both simulated and real data. Based on this empirical evidence, TORS outperforms the other oblique decision trees and has the additional advantage of being easier to interpret.
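
The summary outlines the per-node split-finding procedure: rank predictors by their correlation with the response, dichotomize the continuous response, and fit a weighted linear SVM whose separating hyperplane becomes the oblique split. The sketch below illustrates one such split in Python under assumptions the summary does not pin down: absolute Pearson correlation for the feature ranking, a median threshold for the dichotomization, and class-balanced SVM weights in place of the paper's actual deviance-minimizing weighting scheme; the function name tors_style_split and the n_features parameter are likewise hypothetical.

# Hypothetical sketch of a single TORS-style oblique split, based only on the
# summary above. The median threshold, balanced class weights, and top-k
# feature count are assumptions, not the paper's exact procedure.
import numpy as np
from sklearn.svm import LinearSVC

def tors_style_split(X, y, n_features=5):
    # 1. Feature selection: keep the predictors most correlated with y.
    corr = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(X.shape[1])])
    selected = np.argsort(corr)[::-1][:n_features]

    # 2. Dichotomize the continuous response (median split is an assumption).
    labels = (y > np.median(y)).astype(int)

    # 3. Weighted linear SVM: its separating hyperplane is the oblique split.
    svm = LinearSVC(class_weight="balanced", max_iter=10000)
    svm.fit(X[:, selected], labels)

    # Route each observation by the sign of w.x + b on the selected predictors.
    w, b = svm.coef_.ravel(), svm.intercept_[0]
    side = X[:, selected] @ w + b > 0
    return selected, w, b, side

# Usage on synthetic data: y depends linearly on two of ten predictors.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
y = X[:, 0] + 2.0 * X[:, 3] + rng.normal(scale=0.1, size=200)
selected, w, b, side = tors_style_split(X, y)
print("selected predictors:", selected)
print("left/right node sizes:", int((~side).sum()), int(side.sum()))

In a full tree learner this split would be applied recursively to the two resulting child nodes; only the single-node step described in the summary is sketched here.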