Delving into Sample Loss Curve to Embrace Noisy and Imbalanced Data

Authors

  • Shenwang Jiang, Beijing Institute of Technology
  • Jianan Li, Beijing Institute of Technology
  • Ying Wang, Beijing Institute of Technology
  • Bo Huang, Beijing Institute of Technology
  • Zhang Zhang, University of Massachusetts Lowell
  • Tingfa Xu, Beijing Institute of Technology

DOI:

https://doi.org/10.1609/aaai.v36i6.20661

Keywords:

Machine Learning (ML)

Abstract

Corrupted labels and class imbalance are commonly encountered in practically collected training data, and both easily lead to over-fitting of deep neural networks (DNNs). Existing approaches alleviate these issues by adopting a sample re-weighting strategy, which re-weights each sample with a designed weighting function. However, such a function is applicable only to training data containing a single type of bias, whereas in practice biased samples with corrupted labels and samples from tailed classes commonly co-exist in training data. How to handle them simultaneously is a key but under-explored problem. In this paper, we find that although these two types of biased samples may have similar transient losses, they exhibit distinguishable trends and characteristics in their loss curves, which can provide valuable priors for sample weight assignment. Motivated by this, we delve into the loss curves and propose a novel probe-and-allocate training strategy: in the probing stage, we train the network on the whole biased training data without intervention and record the loss curve of each sample as an additional attribute; in the allocating stage, we feed the resulting attribute to a newly designed curve-perception network, named CurveNet, which learns to identify the bias type of each sample and adaptively assign proper weights through meta-learning. The slow training speed of meta-learning also hinders its application; to address this, we propose skip-layer meta optimization (SLMO), which accelerates training by skipping the bottom layers during meta-optimization. Extensive experiments on synthetic and real data validate the proposed method, which achieves state-of-the-art performance on multiple challenging benchmarks.
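To make the two-stage strategy concrete, below is a minimal PyTorch sketch of the probe-and-allocate idea. All architectural details (the toy MLP inside CurveNet, function names, and loop structure) are illustrative assumptions, not the paper's exact design; the meta-learning update of CurveNet and the SLMO acceleration (which skips meta-gradient computation through the bottom layers) are omitted for brevity.

```python
# Sketch only: probing records per-sample loss curves; allocating maps
# each curve to a sample weight. Not the authors' reference code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class CurveNet(nn.Module):
    """Toy curve-perception net: maps a sample's loss curve
    (loss recorded at T probing epochs) to a scalar weight."""
    def __init__(self, curve_len: int, hidden: int = 64):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(curve_len, hidden), nn.ReLU(),
            nn.Linear(hidden, 1), nn.Sigmoid(),  # weight in (0, 1)
        )

    def forward(self, curves):  # curves: (batch, T)
        return self.mlp(curves).squeeze(-1)

def probe(model, loader, optimizer, epochs, n_samples, device="cpu"):
    """Probing stage: train on the whole biased data without
    intervention, recording each sample's loss at every epoch."""
    curves = torch.zeros(n_samples, epochs)
    for ep in range(epochs):
        for x, y, idx in loader:  # loader must yield sample indices
            x, y = x.to(device), y.to(device)
            loss = F.cross_entropy(model(x), y, reduction="none")
            curves[idx, ep] = loss.detach().cpu()
            optimizer.zero_grad()
            loss.mean().backward()
            optimizer.step()
    return curves  # the extra per-sample attribute

def allocate_step(model, curvenet, x, y, idx, curves, optimizer):
    """One allocating-stage step: CurveNet turns each sample's
    recorded loss curve into a weight for the training loss.
    (The meta-learning update of CurveNet itself is omitted.)"""
    losses = F.cross_entropy(model(x), y, reduction="none")
    weights = curvenet(curves[idx].to(losses.device)).detach()
    optimizer.zero_grad()
    (weights * losses).mean().backward()
    optimizer.step()
```

In the full method, CurveNet's parameters would be meta-optimized against a small unbiased validation set rather than held fixed as above, and SLMO would cut the cost of that inner loop by excluding the network's bottom layers from the meta-gradient computation.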

Published

2022-06-28

How to Cite

Jiang, S., Li, J., Wang, Y., Huang, B., Zhang, Z., & Xu, T. (2022). Delving into Sample Loss Curve to Embrace Noisy and Imbalanced Data. Proceedings of the AAAI Conference on Artificial Intelligence, 36(6), 7024-7032. https://doi.org/10.1609/aaai.v36i6.20661

Section

AAAI Technical Track on Machine Learning I