Trainable Undersampling for Class-Imbalance Learning

Authors

  • Minlong Peng, Fudan University
  • Qi Zhang, Fudan University
  • Xiaoyu Xing, Fudan University
  • Tao Gui, Fudan University
  • Xuanjing Huang, Fudan University
  • Yu-Gang Jiang, Fudan University
  • Keyu Ding, iFLYTEK Co., Ltd.
  • Zhigang Chen, iFLYTEK Co., Ltd.

DOI:

https://doi.org/10.1609/aaai.v33i01.33014707

Abstract

Undersampling is widely used in class-imbalance learning. The main deficiency of most existing undersampling methods is that their sampling strategies are heuristic and independent of the classifier and evaluation metric in use; they may therefore discard instances that are informative for the classifier. In this work, we propose a meta-learning method built on undersampling to address this issue. The key idea is to parametrize the data sampler and train it to optimize classification performance under the evaluation metric. Because this training objective is non-differentiable, we optimize the data sampler via reinforcement learning. By incorporating evaluation-metric optimization into the sampling process, the proposed method learns which instances should be discarded for a given classifier and evaluation metric. In addition, as a data-level operation, the method can be applied to arbitrary evaluation metrics and classifiers, including non-parametric ones (e.g., C4.5 and KNN). Experimental results on both synthetic and real-world datasets demonstrate the effectiveness of the proposed method.
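The core idea in the abstract — a parametrized data sampler trained by reinforcement learning to maximize the evaluation metric — can be sketched roughly as follows. This is a hypothetical illustration, not the authors' implementation: the linear keep-probability policy over majority-class instances, the F1 reward on a held-out split, the decision-tree classifier, and the REINFORCE update with a moving-average baseline are all assumptions chosen for concreteness.

```python
# Hypothetical sketch: trainable undersampling via REINFORCE.
# NOT the paper's implementation; policy, reward, and classifier are assumed.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import f1_score
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Imbalanced synthetic data (roughly 9:1 majority:minority).
X, y = make_classification(n_samples=2000, n_features=10,
                           weights=[0.9, 0.1], random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, stratify=y, random_state=0)

# Sampler policy: keep each majority-class instance with probability
# sigmoid(w . x + b); minority-class instances are always kept.
w = np.zeros(X_tr.shape[1])
b = 0.0
maj = (y_tr == 0)
baseline = 0.0  # moving-average reward baseline for variance reduction
lr = 0.5

for step in range(30):
    logits = X_tr[maj] @ w + b
    p_keep = 1.0 / (1.0 + np.exp(-logits))
    sampled = rng.random(p_keep.shape) < p_keep     # sample the keep/drop action
    keep = np.ones(len(y_tr), dtype=bool)
    keep[np.where(maj)[0][~sampled]] = False

    # Train the (non-parametric) classifier on the undersampled set and
    # use the evaluation metric on a validation split as the reward.
    clf = DecisionTreeClassifier(random_state=0).fit(X_tr[keep], y_tr[keep])
    reward = f1_score(y_val, clf.predict(X_val))

    # REINFORCE: gradient of the log-probability of the sampled mask,
    # scaled by the advantage (reward minus baseline).
    adv = reward - baseline
    baseline = 0.9 * baseline + 0.1 * reward
    grad_logit = (sampled.astype(float) - p_keep) * adv
    w += lr * (X_tr[maj].T @ grad_logit) / maj.sum()
    b += lr * grad_logit.mean()
```

Because the sampler acts purely at the data level, swapping in a different classifier (e.g., KNN) or a different reward metric requires changing only the two lines that compute `clf` and `reward`.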

Published

2019-07-17

How to Cite

Peng, M., Zhang, Q., Xing, X., Gui, T., Huang, X., Jiang, Y.-G., Ding, K., & Chen, Z. (2019). Trainable Undersampling for Class-Imbalance Learning. Proceedings of the AAAI Conference on Artificial Intelligence, 33(01), 4707-4714. https://doi.org/10.1609/aaai.v33i01.33014707

Section

AAAI Technical Track: Machine Learning