Optimal Sparse Recovery with Decision Stumps

Authors

  • Kiarash Banihashem, University of Maryland
  • Mohammad Hajiaghayi, University of Maryland
  • Max Springer, University of Maryland

DOI:

https://doi.org/10.1609/aaai.v37i6.25827

Keywords:

ML: Dimensionality Reduction/Feature Selection, ML: Classification and Regression

Abstract

Decision trees are widely used for their low computational cost, good predictive performance, and ability to assess the importance of features. Though often used in practice for feature selection, the theoretical guarantees of these methods are not well understood. Here we obtain a tight finite-sample bound for the feature selection problem in linear regression using single-depth decision trees. We examine the statistical properties of these "decision stumps" for the recovery of the s active features from p total features, where s << p. Our analysis provides tight sample performance guarantees on high-dimensional sparse systems that match the finite-sample bound of O(s log p) obtained by Lasso, improving upon previous bounds for both the median and optimal splitting criteria. Our results extend to the non-linear regime as well as arbitrary sub-Gaussian distributions, demonstrating that tree-based methods attain strong feature selection properties under a wide variety of settings and further shedding light on the success of these methods in practice. As a byproduct of our analysis, we show that recovery can be provably guaranteed even when the number of active features s is unknown. We further validate our theoretical results and proof methodology with computational experiments.
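The screening procedure the abstract describes can be illustrated with a short sketch (not the authors' code): for each of the p features, fit a depth-1 regression tree (a "stump") and score the feature by the largest reduction in squared error any single split achieves; the s highest-scoring features are selected. The function names and the toy data below are illustrative assumptions, not from the paper.

```python
# Hypothetical sketch of decision-stump feature screening, assuming an
# optimal splitting criterion (best SSE reduction over all thresholds).
import numpy as np

def stump_score(x, y):
    """Best achievable sum-of-squared-errors reduction from one split on x."""
    order = np.argsort(x)
    ys = y[order]
    n = len(ys)
    total_sse = np.sum((ys - ys.mean()) ** 2)
    csum = np.cumsum(ys)  # prefix sums give each side's mean in O(1)
    best = 0.0
    for i in range(1, n):  # split: left = first i sorted points, right = rest
        left_mean = csum[i - 1] / i
        right_mean = (csum[-1] - csum[i - 1]) / (n - i)
        sse = (np.sum((ys[:i] - left_mean) ** 2)
               + np.sum((ys[i:] - right_mean) ** 2))
        best = max(best, total_sse - sse)
    return best

def select_features(X, y, s):
    """Return the indices of the s highest-scoring features."""
    scores = np.array([stump_score(X[:, j], y) for j in range(X.shape[1])])
    return np.argsort(scores)[::-1][:s]

# Toy sparse linear model: only features 0 and 1 are active (s=2, p=20).
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 20))
y = 3 * X[:, 0] + 2 * X[:, 1] + 0.1 * rng.normal(size=500)
print(sorted(select_features(X, y, 2)))  # recovers the active set [0, 1]
```

Each stump here considers every threshold on one feature, so an active feature's split explains far more variance than the best spurious split on an inactive one; ranking by this score is what makes the stump a feature selector.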

Published

2023-06-26

How to Cite

Banihashem, K., Hajiaghayi, M., & Springer, M. (2023). Optimal Sparse Recovery with Decision Stumps. Proceedings of the AAAI Conference on Artificial Intelligence, 37(6), 6745-6752. https://doi.org/10.1609/aaai.v37i6.25827

Section

AAAI Technical Track on Machine Learning I