Semi-Supervised Learning with Variational Bayesian Inference and Maximum Uncertainty Regularization

Authors

  • Kien Do, Deakin University
  • Truyen Tran, Deakin University
  • Svetha Venkatesh, Deakin University

Keywords

Semi-Supervised Learning

Abstract

We propose two generic methods for improving semi-supervised learning (SSL). The first integrates weight perturbation (WP) into existing “consistency regularization” (CR) based methods. We implement WP by leveraging variational Bayesian inference (VBI). The second method proposes a novel consistency loss called “maximum uncertainty regularization” (MUR). While most consistency losses act on perturbations in the vicinity of each data point, MUR actively searches for “virtual” points situated beyond this region that cause the most uncertain class predictions. This allows MUR to impose smoothness on a wider area in the input-output manifold. Our experiments show clear reductions in the classification errors of various CR-based methods when they are combined with VBI, MUR, or both.
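The MUR idea in the abstract can be made concrete with a small sketch. The paper's exact formulation is not reproduced on this page, so the following is a minimal numpy illustration under stated assumptions: the inner search is gradient ascent on predictive entropy within an L2 ball around each data point (a numerical gradient is used for self-containment), a linear classifier stands in for the network, and the consistency term is a squared L2 distance between the predictions at the data point and at the max-uncertainty point. All function names here are illustrative, not from the paper.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def predict(x, W, b):
    # hypothetical stand-in for the classifier f_theta: a linear model
    return softmax(x @ W + b)

def entropy(p):
    return float(-(p * np.log(p + 1e-12)).sum())

def max_uncertainty_point(x, W, b, radius=1.0, steps=10, lr=0.5):
    """Search for a 'virtual' point x* near x whose class prediction is
    maximally uncertain: gradient ascent on predictive entropy (numerical
    gradient here), projected onto an L2 ball of `radius` around x."""
    best_x, best_h = x.copy(), entropy(predict(x, W, b))
    x_star, eps = x.copy(), 1e-4
    for _ in range(steps):
        grad = np.zeros_like(x_star)
        for i in range(x_star.size):
            d = np.zeros_like(x_star)
            d[i] = eps
            grad[i] = (entropy(predict(x_star + d, W, b))
                       - entropy(predict(x_star - d, W, b))) / (2 * eps)
        x_star = x_star + lr * grad
        delta = x_star - x
        norm = np.linalg.norm(delta)
        if norm > radius:  # project back onto the search ball
            x_star = x + delta * (radius / norm)
        h = entropy(predict(x_star, W, b))
        if h > best_h:  # keep the most uncertain point seen so far
            best_x, best_h = x_star.copy(), h
    return best_x

def mur_loss(x, W, b, **search_kwargs):
    # consistency term: squared L2 distance between the predictions
    # at x and at the max-uncertainty point x*
    x_star = max_uncertainty_point(x, W, b, **search_kwargs)
    p, p_star = predict(x, W, b), predict(x_star, W, b)
    return float(((p - p_star) ** 2).sum())
```

In a real training loop the inner maximization would be done with automatic differentiation on minibatches rather than a coordinate-wise numerical gradient, and `mur_loss` would be added to the supervised loss; the sketch only shows why MUR reaches beyond a data point's immediate vicinity: it follows the entropy gradient to the most uncertain point inside the allowed region.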

Published

2021-05-18

How to Cite

Do, K., Tran, T., & Venkatesh, S. (2021). Semi-Supervised Learning with Variational Bayesian Inference and Maximum Uncertainty Regularization. Proceedings of the AAAI Conference on Artificial Intelligence, 35(8), 7236-7244. Retrieved from https://ojs.aaai.org/index.php/AAAI/article/view/16889

Section

AAAI Technical Track on Machine Learning I