MERCS: Multi-Directional Ensembles of Regression and Classification Trees

Authors

  • Elia Van Wolputte, KU Leuven
  • Evgeniya Korneva, KU Leuven
  • Hendrik Blockeel, KU Leuven

DOI

https://doi.org/10.1609/aaai.v32i1.11735

Keywords

Decision Trees, Random Forests, Versatile Models

Abstract

Learning a function f(X) that predicts Y from X is the archetypal Machine Learning (ML) problem. Typically, both sets of attributes (i.e., X and Y) have to be known before a model can be trained. When this is not the case, or when functions f(X) that predict Y from X are needed for varying X and Y, this may introduce significant overhead (a separate learning run for each function). In this paper, we explore the possibility of omitting the specification of X and Y at training time altogether, by learning a multi-directional, or versatile, model that allows prediction of any Y from any X. Specifically, we introduce a decision tree-based paradigm that generalizes the well-known Random Forests approach to allow for multi-directionality. The result of these efforts is a novel method called MERCS: Multi-directional Ensembles of Regression and Classification treeS. Experiments show the viability of the approach.
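
The page itself contains no code; the sketch below is only a minimal, hypothetical illustration of the multi-directional idea described in the abstract, not the authors' actual MERCS algorithm. It fits one scikit-learn decision tree per target attribute, using all remaining attributes as inputs, so that any attribute can later play the role of Y. The toy data and all variable names are invented for illustration; MERCS itself builds multi-target trees and combines them as an ensemble, which this sketch omits.

```python
# Minimal sketch of a "versatile" model: one tree per target attribute,
# so any attribute can be predicted from the others. Not the MERCS system.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, DecisionTreeRegressor

rng = np.random.default_rng(0)

# Toy data: three attributes with some dependency structure.
n = 500
a = rng.normal(size=n)
b = 2 * a + rng.normal(scale=0.1, size=n)
c = (a + b > 0).astype(int)              # a nominal (categorical) attribute
data = np.column_stack([a, b, c])
is_nominal = [False, False, True]

# For each attribute, fit a tree that predicts it from all remaining attributes.
models = {}
for tgt in range(data.shape[1]):
    X = np.delete(data, tgt, axis=1)
    y = data[:, tgt]
    tree = DecisionTreeClassifier() if is_nominal[tgt] else DecisionTreeRegressor()
    models[tgt] = tree.fit(X, y.astype(int) if is_nominal[tgt] else y)

# At prediction time, any attribute can play the role of Y.
query = np.array([[0.5, 1.0, 1.0]])      # observed values for (a, b, c)
tgt = 1                                  # ask for attribute b
print(models[tgt].predict(np.delete(query, tgt, axis=1)))
```

In this toy setup the choice of target is deferred to prediction time, which is the point of a multi-directional model: no separate learning run is needed when a different attribute becomes the target.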

Published

2018-04-29

How to Cite

Van Wolputte, E., Korneva, E., & Blockeel, H. (2018). MERCS: Multi-Directional Ensembles of Regression and Classification Trees. Proceedings of the AAAI Conference on Artificial Intelligence, 32(1). https://doi.org/10.1609/aaai.v32i1.11735