Tree Learning: Optimal Sample Complexity and Algorithms

Authors

  • Dmitrii Avdiukhin, Indiana University, Bloomington
  • Grigory Yaroslavtsev, George Mason University
  • Danny Vainstein, Tel-Aviv University
  • Orr Fischer, Weizmann Institute of Science
  • Sauman Das, Thomas Jefferson High School for Science and Technology
  • Faraz Mirza, Thomas Jefferson High School for Science and Technology

DOI:

https://doi.org/10.1609/aaai.v37i6.25822

Keywords:

ML: Clustering, ML: Learning Theory, ML: Relational Learning

Abstract

We study the problem of learning a hierarchical tree representation of data from labeled samples, taken from an arbitrary (and possibly adversarial) distribution. Consider a collection of data tuples labeled according to their hierarchical structure. The smallest number of such tuples required to accurately label subsequent tuples is of interest for data collection in machine learning. We present optimal sample complexity bounds for this problem in several learning settings, including (agnostic) PAC learning and online learning. Our results are based on tight bounds on the Natarajan and Littlestone dimensions of the associated problem. The corresponding tree classifiers can be constructed efficiently in near-linear time.
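To make the setup concrete, here is a minimal sketch (not taken from the paper) of what "tuples labeled according to their hierarchical structure" can look like: items are leaves of a hidden hierarchy, and each sample is a triple labeled by which pair shares the deepest least common ancestor. The toy tree, function names, and encoding below are illustrative assumptions, not the authors' construction.

```python
# Hypothetical hierarchy encoded as parent pointers; the root points to itself.
parent = {
    "root": "root",
    "animals": "root", "plants": "root",
    "cat": "animals", "dog": "animals", "oak": "plants",
}

def depth(node):
    """Distance from the root along parent pointers."""
    d = 0
    while parent[node] != node:
        node = parent[node]
        d += 1
    return d

def lca(u, v):
    """Least common ancestor, found by walking the deeper node upward."""
    while u != v:
        if depth(u) >= depth(v):
            u = parent[u]
        else:
            v = parent[v]
    return u

def label_triple(a, b, c):
    """Label a triple by the pair whose LCA is deepest (most closely related)."""
    pairs = [(a, b), (a, c), (b, c)]
    return max(pairs, key=lambda p: depth(lca(*p)))

# "cat" and "dog" are paired: their LCA ("animals") is deeper than
# the LCA of any pair involving "oak", which is the root.
print(label_triple("cat", "dog", "oak"))
```

In this framing, the sample complexity question is how many such labeled triples are needed before the learner can label new triples accurately, regardless of which triples the (possibly adversarial) distribution produces.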

Published

2023-06-26

How to Cite

Avdiukhin, D., Yaroslavtsev, G., Vainstein, D., Fischer, O., Das, S., & Mirza, F. (2023). Tree Learning: Optimal Sample Complexity and Algorithms. Proceedings of the AAAI Conference on Artificial Intelligence, 37(6), 6701-6708. https://doi.org/10.1609/aaai.v37i6.25822

Section

AAAI Technical Track on Machine Learning I