Scalable Optimal Multiway-Split Decision Trees with Constraints

Authors

  • Shivaram Subramanian IBM Research
  • Wei Sun IBM Research

DOI:

https://doi.org/10.1609/aaai.v37i8.26180

Keywords:

ML: Classification and Regression, ML: Applications, ML: Optimization

Abstract

There has been a surge of interest in recent years in learning optimal decision trees with mixed-integer programming (MIP), because heuristic methods neither guarantee optimality nor easily incorporate constraints that are critical in many practical applications. However, existing MIP methods built on an arc-based formulation do not scale well: the number of binary variables grows on the order of 2^d x n, where d is the tree depth and n is the dataset size. Moreover, they can handle only sample-level constraints and linear metrics. In this paper, we propose a novel path-based MIP formulation in which the number of decision variables is independent of the dataset size, and we present a scalable column generation framework to solve the MIP. Our framework produces a multiway-split tree, which is more interpretable than typical binary-split trees because its rules are shorter. Our framework is also more general, as it can handle nonlinear metrics such as the F1 score and incorporate a broader class of constraints. We demonstrate its efficacy with extensive experiments. We present results on datasets containing up to 1,008,372 samples, whereas existing MIP-based decision tree models do not scale well beyond a few thousand samples. We report superior or competitive results compared to state-of-the-art MIP-based methods with up to a 24X reduction in runtime.
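To make the scaling contrast in the abstract concrete (the notation below is illustrative and not taken from the paper): an arc-based formulation assigns routing variables to every (sample, leaf) pair, so for tree depth d and n training samples the binary variable count is roughly

  O(2^d \cdot n),

whereas a path-based formulation introduces one variable per candidate decision path, so the variable count is governed by the number of paths generated by column generation rather than by n.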

Published

2023-06-26

How to Cite

Subramanian, S., & Sun, W. (2023). Scalable Optimal Multiway-Split Decision Trees with Constraints. Proceedings of the AAAI Conference on Artificial Intelligence, 37(8), 9891-9899. https://doi.org/10.1609/aaai.v37i8.26180

Section

AAAI Technical Track on Machine Learning III