Fusing Conditional Submodular GAN and Programmatic Weak Supervision

Authors

  • Kumar Shubham, Indian Institute of Science, Bangalore, India
  • Pranav Sastry, Indian Institute of Science, Bangalore, India
  • Prathosh AP, Indian Institute of Science, Bangalore, India

DOI

https://doi.org/10.1609/aaai.v38i13.29423

Keywords

ML: Ensemble Methods, ML: Deep Generative Models & Autoencoders, RU: Applications

Abstract

Programmatic Weak Supervision (PWS) and generative models serve as crucial tools that enable researchers to maximize the utility of existing datasets without resorting to laborious data gathering and manual annotation processes. PWS uses various weak supervision techniques to estimate the underlying class labels of data, while generative models primarily concentrate on sampling from the underlying distribution of the given dataset. Although these methods have the potential to complement each other, they have mostly been studied independently. Recently, WSGAN proposed a mechanism to fuse these two models. Their approach utilizes the discrete latent factors of InfoGAN to train the label model and leverages the class-dependent information of the label model to generate images of specific classes. However, the disentangled latent factors learned by InfoGAN are not necessarily class-specific and can therefore degrade the label model's accuracy. Moreover, the prediction of the label model is often noisy and can have a detrimental impact on the quality of images generated by the GAN. In our work, we address these challenges by (i) implementing a noise-aware classifier using the pseudo labels generated by the label model, and (ii) utilizing the predictions of the noise-aware classifier both for training the label model and for generating class-conditioned images. Additionally, we investigate the effect of training the classifier on a subset of the dataset selected within a defined uncertainty budget on the pseudo labels. We accomplish this by formalizing the subset selection problem as submodular maximization with a knapsack constraint on the entropy of the pseudo labels. We conduct experiments on multiple datasets and demonstrate the efficacy of our methods on several tasks vis-à-vis the current state-of-the-art methods. Our implementation is available at https://github.com/kyrs/subpws-gan.
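The sketch below illustrates the kind of subset selection the abstract describes: greedy submodular maximization under a knapsack constraint, where each point's cost is the entropy of its pseudo-label distribution. It is not the authors' implementation; the facility-location objective, the cost-benefit greedy rule, and all function and variable names are illustrative assumptions.

# Minimal sketch (assumptions, not the paper's code): select a subset that
# maximizes a facility-location submodular objective subject to a budget on
# the total entropy of the selected points' pseudo-label posteriors.
import numpy as np

def entropy(probs, eps=1e-12):
    """Shannon entropy of each row of pseudo-label probabilities."""
    return -np.sum(probs * np.log(probs + eps), axis=1)

def facility_location_gain(sim, candidate, current_max):
    """Marginal gain of adding `candidate` to f(S) = sum_i max_{j in S} sim[i, j]."""
    return np.sum(np.maximum(current_max, sim[:, candidate]) - current_max)

def greedy_knapsack_selection(sim, costs, budget):
    """Cost-benefit greedy: repeatedly add the point with the best
    marginal-gain-to-entropy-cost ratio while the budget allows."""
    n = sim.shape[0]
    selected, spent = [], 0.0
    current_max = np.zeros(n)          # max similarity of each point to the selected set
    remaining = set(range(n))
    while remaining:
        best, best_ratio = None, -np.inf
        for c in remaining:
            if spent + costs[c] > budget:
                continue
            gain = facility_location_gain(sim, c, current_max)
            ratio = gain / max(costs[c], 1e-12)
            if ratio > best_ratio:
                best, best_ratio = c, ratio
        if best is None:               # no affordable candidate left
            break
        selected.append(best)
        spent += costs[best]
        current_max = np.maximum(current_max, sim[:, best])
        remaining.remove(best)
    return selected

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    feats = rng.normal(size=(200, 16))            # stand-in image features
    sim = feats @ feats.T                         # similarity matrix
    sim -= sim.min()                              # keep similarities non-negative
    probs = rng.dirichlet(np.ones(10), size=200)  # stand-in pseudo-label posteriors
    subset = greedy_knapsack_selection(sim, entropy(probs), budget=20.0)
    print(f"selected {len(subset)} points within the entropy budget")

In this sketch, points whose pseudo labels are confident (low entropy) are cheap to add, so the greedy rule favors a diverse yet low-uncertainty subset within the budget.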

Published

2024-03-24

How to Cite

Shubham, K., Sastry, P., & AP, P. (2024). Fusing Conditional Submodular GAN and Programmatic Weak Supervision. Proceedings of the AAAI Conference on Artificial Intelligence, 38(13), 15020-15028. https://doi.org/10.1609/aaai.v38i13.29423

Issue

Vol. 38 No. 13 (2024)

Section

AAAI Technical Track on Machine Learning IV