Optimal Decision Trees for Nonlinear Metrics

Authors

  • Emir Demirović, Delft University of Technology
  • Peter J. Stuckey, Monash University and Data61

Keywords

Search, Optimization, Constraint Optimization

Abstract

Nonlinear metrics, such as the F1-score, Matthews correlation coefficient, and Fowlkes–Mallows index, are often used to evaluate the performance of machine learning models, in particular when facing imbalanced datasets that contain more samples of one class than the other. Recent optimal decision tree algorithms have shown remarkable progress in producing trees that are optimal with respect to linear criteria, such as accuracy, but unfortunately nonlinear metrics remain a challenge. To address this gap, we propose a novel algorithm based on bi-objective optimisation, which treats the misclassifications of each binary class as a separate objective. We show that, for a large class of metrics, the optimal tree lies on the Pareto frontier. Consequently, we obtain the optimal tree by using our method to generate the set of all nondominated trees. To the best of our knowledge, this is the first method to compute provably optimal decision trees for nonlinear metrics. Our approach leads to a trade-off when compared to optimising linear metrics: the resulting trees may be more desirable according to the given nonlinear metric, at the expense of higher runtimes. Nevertheless, the experiments illustrate that runtimes are reasonable for the majority of the tested datasets.
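The selection step described in the abstract can be sketched as follows: once candidate trees are summarised by their (false positive, false negative) counts, only nondominated pairs need to be kept, and a nonlinear metric such as the F1-score is maximised by scanning that frontier. This is a minimal illustrative sketch, not the paper's algorithm; the candidate list, function names, and class counts are hypothetical.

```python
def pareto_frontier(trees):
    """Keep the (false_pos, false_neg) pairs not dominated by another pair.

    A pair is dominated if some other pair is no worse on both objectives
    and strictly different.
    """
    frontier = []
    for fp, fn in trees:
        dominated = any(
            ofp <= fp and ofn <= fn and (ofp, ofn) != (fp, fn)
            for ofp, ofn in trees
        )
        if not dominated:
            frontier.append((fp, fn))
    return frontier


def f1_score(fp, fn, n_pos):
    """F1-score from misclassification counts, given n_pos positive samples."""
    tp = n_pos - fn
    return 2 * tp / (2 * tp + fp + fn)


# Hypothetical candidate trees, each summarised as (FP, FN), with 100 positives.
candidates = [(5, 20), (10, 10), (20, 4), (12, 15)]
frontier = pareto_frontier(candidates)
# The F1-optimal tree is guaranteed to lie on the frontier, so it suffices
# to evaluate the metric only on nondominated trees.
best = max(frontier, key=lambda t: f1_score(*t, n_pos=100))
```

The same frontier scan works for any metric that is monotone in the two misclassification counts, which is the key property exploited by the bi-objective formulation.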

Published

2021-05-18

How to Cite

Demirović, E., & Stuckey, P. J. (2021). Optimal Decision Trees for Nonlinear Metrics. Proceedings of the AAAI Conference on Artificial Intelligence, 35(5), 3733-3741. Retrieved from https://ojs.aaai.org/index.php/AAAI/article/view/16490

Section

AAAI Technical Track on Constraint Satisfaction and Optimization