On Sparse Optimal Regression Trees

Rafael Blanquero, Emilio Carrizosa, Cristina Molero-Río*, Dolores Romero Morales

*Corresponding author of this work

Publication: Contribution to journal › Journal article › Research › peer review


Abstract

In this paper, we model an optimal regression tree through a continuous optimization problem, where a compromise is sought between prediction accuracy and both types of sparsity, namely local and global. Our approach can accommodate important desirable properties for the regression task, such as cost-sensitivity and fairness. Thanks to the smoothness of the predictions, we can derive local explanations on the continuous predictor variables. The reported computational experience shows that our approach outperforms standard benchmark regression methods, such as CART, OLS and LASSO, in terms of prediction accuracy. Moreover, the scalability of our approach with respect to the size of the training sample is illustrated.
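The benchmark methods named in the abstract (CART, OLS and LASSO) are available off the shelf. The following is a minimal sketch of fitting and scoring them, assuming scikit-learn and a synthetic dataset; the data, hyperparameters, and evaluation metric are illustrative assumptions, not the paper's experimental setup.

```python
# Illustrative sketch only: the dataset and hyperparameters below are
# assumptions for demonstration, not the paper's experimental protocol.
from sklearn.datasets import make_regression
from sklearn.linear_model import Lasso, LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor

# Synthetic regression data in place of the paper's benchmark datasets.
X, y = make_regression(n_samples=500, n_features=10, noise=10.0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# The three baseline regressors mentioned in the abstract.
benchmarks = {
    "CART": DecisionTreeRegressor(max_depth=3, random_state=0),
    "OLS": LinearRegression(),
    "LASSO": Lasso(alpha=1.0),
}

# Fit each baseline and report out-of-sample R^2.
scores = {name: m.fit(X_tr, y_tr).score(X_te, y_te) for name, m in benchmarks.items()}
for name, r2 in scores.items():
    print(f"{name}: R^2 = {r2:.3f}")
```

Note that LASSO's `alpha` (and CART's `max_depth`) would normally be tuned by cross-validation rather than fixed as above.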
Original language: English
Journal: European Journal of Operational Research
Volume: 299
Issue number: 3
Pages (from-to): 1045-1054
Number of pages: 10
ISSN: 0377-2217
DOI
Status: Published - Jun. 2022

Bibliographical note

Published online: 18 December 2021.

Keywords

  • Machine learning
  • Classification and regression trees
  • Optimal regression trees
  • Sparsity
  • Nonlinear programming
