On Sparse Optimal Regression Trees

Rafael Blanquero, Emilio Carrizosa, Cristina Molero-Río*, Dolores Romero Morales

*Corresponding author of this work

Publication: Contribution to journal › Journal article › Research › Peer-reviewed



In this paper, we model an optimal regression tree through a continuous optimization problem, seeking a compromise between prediction accuracy and two types of sparsity, local and global. Our approach can accommodate important desirable properties for the regression task, such as cost-sensitivity and fairness. Thanks to the smoothness of the predictions, we can derive local explanations on the continuous predictor variables. The computational experience reported shows that our approach outperforms standard benchmark regression methods such as CART, OLS and LASSO in terms of prediction accuracy. Moreover, we illustrate the scalability of our approach with respect to the size of the training sample.
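The paper's exact formulation is not reproduced in this record, but the core idea of trading prediction accuracy against a sparsity penalty in a smooth, continuously optimized tree can be sketched as follows. This is a minimal illustration, not the authors' model: the depth-1 "soft" tree, the sigmoid gate, the smoothed l1 penalty and the weight `lam` are all simplifying assumptions introduced here.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

def soft_tree_predict(params, X):
    """Depth-1 'soft' regression tree: a sigmoid gate smoothly routes
    each sample between two leaf values (toy stand-in for the paper's
    deeper randomized trees)."""
    d = X.shape[1]
    a, b = params[:d], params[d]
    mu_left, mu_right = params[d + 1], params[d + 2]
    gate = expit(X @ a + b)  # smooth surrogate of a hard axis/oblique split
    return gate * mu_left + (1.0 - gate) * mu_right

def objective(params, X, y, lam=0.1, eps=1e-6):
    """MSE plus a smoothed l1 penalty on the split coefficients.
    With a single split node, local sparsity (few features per node)
    and global sparsity (few features overall) coincide."""
    d = X.shape[1]
    mse = np.mean((soft_tree_predict(params, X) - y) ** 2)
    penalty = np.sum(np.sqrt(params[:d] ** 2 + eps))
    return mse + lam * penalty

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
# Only feature 0 carries signal; the penalty should shrink the rest.
y = np.where(X[:, 0] > 0, 2.0, -1.0) + 0.1 * rng.normal(size=200)

# Small random start breaks the symmetry between the two leaves.
x0 = 0.1 * rng.normal(size=X.shape[1] + 3)
res = minimize(objective, x0=x0, args=(X, y), method="L-BFGS-B")
coefs = res.x[:X.shape[1]]
```

Because the whole objective is smooth, gradients with respect to the continuous predictors are available, which is what enables the local explanations mentioned in the abstract.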
Journal: European Journal of Operational Research
Issue number: 3
Pages (from-to): 1045-1054
Number of pages: 10
Status: Published - June 2022


  • Machine learning
  • Classification and regression trees
  • Optimal regression trees
  • Sparsity
  • Nonlinear programming