Variation-Bounded Loss for Noise-Tolerant Learning

Authors

  • Jialiang Wang, Harbin Institute of Technology; City University of Hong Kong
  • Xiong Zhou, Harbin Institute of Technology
  • Xianming Liu, Harbin Institute of Technology
  • Gangfeng Hu, Harbin Institute of Technology
  • Deming Zhai, Harbin Institute of Technology
  • Junjun Jiang, Harbin Institute of Technology
  • Haoliang Li, City University of Hong Kong

DOI:

https://doi.org/10.1609/aaai.v40i31.39829

Abstract

Mitigating the negative impact of noisy labels has been a perennial issue in supervised learning. Robust loss functions have emerged as a prevalent solution to this problem. In this work, we introduce the Variation Ratio as a novel property related to the robustness of loss functions, and propose a new family of robust loss functions, termed Variation-Bounded Loss (VBL), which is characterized by a bounded variation ratio. We provide theoretical analyses of the variation ratio, proving that a smaller variation ratio leads to better robustness. Furthermore, we reveal that the variation ratio provides a feasible way to relax the symmetric condition and offers a more concise path to achieving the asymmetric condition. Based on the variation ratio, we reformulate several commonly used loss functions into a variation-bounded form for practical applications. Experiments on various datasets demonstrate the effectiveness and flexibility of our approach.
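For context on the symmetric condition that the abstract proposes to relax: in the robust-loss literature, a loss is symmetric when the sum of losses over all class labels is a constant independent of the model's prediction. The sketch below is a minimal numerical illustration of that standard condition (not the paper's VBL formulation, whose definitions are given in the full text): Mean Absolute Error satisfies it, while cross-entropy does not.

```python
import numpy as np

def mae(p, k):
    """MAE between a probability vector p and the one-hot encoding of label k."""
    onehot = np.eye(len(p))[k]
    return np.abs(p - onehot).sum()

def ce(p, k):
    """Cross-entropy loss for label k."""
    return -np.log(p[k])

rng = np.random.default_rng(0)
logits = rng.normal(size=4)
p = np.exp(logits) / np.exp(logits).sum()  # softmax probabilities, C = 4 classes

# MAE: summing the loss over all C labels gives 2*(C-1) for ANY p,
# so the sum is constant -> MAE is a symmetric loss.
mae_sum = sum(mae(p, k) for k in range(4))
print(mae_sum)  # 6.0, i.e. 2*(4-1), regardless of p

# CE: the same sum is -sum_k log p_k, which depends on p -> not symmetric.
ce_sum = sum(ce(p, k) for k in range(4))
print(ce_sum)
```

Re-running with different random logits leaves `mae_sum` unchanged but shifts `ce_sum`, which is the behavior the symmetric condition captures.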

Published

2026-03-14

How to Cite

Wang, J., Zhou, X., Liu, X., Hu, G., Zhai, D., Jiang, J., & Li, H. (2026). Variation-Bounded Loss for Noise-Tolerant Learning. Proceedings of the AAAI Conference on Artificial Intelligence, 40(31), 26251–26259. https://doi.org/10.1609/aaai.v40i31.39829

Section

AAAI Technical Track on Machine Learning VIII