Feature-Cost Sensitive Learning with Submodular Trees of Classifiers

Authors

  • Matt Kusner, Washington University in St. Louis
  • Wenlin Chen, Washington University in St. Louis
  • Quan Zhou, Tsinghua University
  • Zhixiang (Eddie) Xu, Washington University in St. Louis
  • Kilian Weinberger, Washington University in St. Louis
  • Yixin Chen, Washington University in St. Louis

DOI:

https://doi.org/10.1609/aaai.v28i1.8967

Keywords:

submodular optimization, feature-cost sensitive learning, budgeted learning, tree-based learning

Abstract

During the past decade, machine learning algorithms have become commonplace in large-scale real-world industrial applications. In these settings, the computation time to train and test machine learning algorithms is a key consideration. At training time, the algorithms must scale to very large data set sizes. At testing time, the cost of feature extraction can dominate the CPU runtime. Recently, a promising method, the Cost-sensitive Tree of Classifiers (CSTC), was proposed to account for the feature extraction cost at testing time. Although the CSTC problem is NP-hard, the authors suggest an approximation through a mixed-norm relaxation across many classifiers. This relaxation is slow to train and requires involved tuning of optimization hyperparameters. We propose a different relaxation using approximate submodularity, called the Approximately Submodular Tree of Classifiers (ASTC). ASTC is much simpler to implement, yields equivalent results, requires no optimization hyperparameter tuning, and is up to two orders of magnitude faster to train.
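Objectives that are (approximately) submodular under a budget constraint are typically maximized greedily, adding at each step the element with the best marginal gain per unit cost. The sketch below illustrates that generic cost-sensitive greedy pattern; it is not the authors' ASTC implementation, and the `gain` set function, per-feature `costs`, and `budget` are hypothetical stand-ins for illustration only.

```python
def greedy_select(n_features, costs, budget, gain):
    """Greedily add the feature with the best marginal gain per unit
    extraction cost until the feature budget is exhausted.

    `gain` is a set function scoring a feature subset (assumed
    approximately submodular); `costs[f]` is the extraction cost of
    feature f. Both are hypothetical placeholders, not ASTC internals.
    """
    selected = set()
    spent = 0.0
    remaining = set(range(n_features))
    while remaining:
        base = gain(selected)
        # Marginal gain per unit cost for each still-affordable feature.
        scores = {
            f: (gain(selected | {f}) - base) / costs[f]
            for f in remaining
            if spent + costs[f] <= budget
        }
        if not scores:
            break  # no affordable feature left
        best = max(scores, key=scores.get)
        if scores[best] <= 0:
            break  # no remaining feature improves the objective
        selected.add(best)
        spent += costs[best]
        remaining.remove(best)
    return selected


# Toy usage: here gain is a modular (hence submodular) sum of values.
values = [3.0, 1.0, 4.0, 1.5]
costs = [2.0, 1.0, 5.0, 1.0]
chosen = greedy_select(4, costs, budget=4.0,
                       gain=lambda S: sum(values[f] for f in S))
print(chosen)  # e.g. {0, 1, 3}
```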

Published

2014-06-21

How to Cite

Kusner, M., Chen, W., Zhou, Q., Xu, Z. (Eddie), Weinberger, K., & Chen, Y. (2014). Feature-Cost Sensitive Learning with Submodular Trees of Classifiers. Proceedings of the AAAI Conference on Artificial Intelligence, 28(1). https://doi.org/10.1609/aaai.v28i1.8967

Section

Main Track: Novel Machine Learning Algorithms