HiABP: Hierarchical Initialized ABP for Unsupervised Representation Learning

Authors

  • Jiankai Sun, Centre for Perceptual and Interactive Intelligence, The Chinese University of Hong Kong
  • Rui Liu, The Chinese University of Hong Kong
  • Bolei Zhou, Centre for Perceptual and Interactive Intelligence, The Chinese University of Hong Kong

Keywords

Representation Learning, Unsupervised & Self-Supervised Learning, Applications

Abstract

Although Markov chain Monte Carlo (MCMC) is useful for generating samples from the posterior distribution, it often becomes intractable on large-scale datasets. To address this issue, we propose Hierarchical Initialized Alternating Back-propagation (HiABP) for efficient Bayesian inference. Specifically, we endow the Alternating Back-propagation (ABP) method with a well-designed initializer and a hierarchical structure, forming a pipeline of initializing, improving, and learning by back-propagation. The initializer saves the generative model substantial time by constraining a sampler to stay close to the true posterior distribution, and the initialized latent variable is then improved significantly by an MCMC sampler. The proposed method thus combines the strengths of both approaches: the effectiveness of MCMC and the efficiency of variational inference. Experimental results show that our framework outperforms other popular deep generative models in modeling natural images and learning from incomplete data. We further demonstrate the unsupervised disentanglement of the hierarchical latent representation with controllable image synthesis.
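The initialize-then-refine idea in the abstract can be illustrated on a toy linear Gaussian model, where the true posterior is known in closed form. This is a minimal sketch, not the paper's implementation: the generator, the crude pseudo-inverse "initializer" (`encode`), and the short-run Langevin refiner (`langevin_refine`) are all illustrative stand-ins chosen so the two-step pipeline is visible.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear generator: x = W z + eps, eps ~ N(0, sigma^2 I), prior z ~ N(0, I).
# W, encode, and langevin_refine are illustrative names, not from the paper.
d_z, d_x, sigma = 2, 5, 0.5
W = rng.normal(size=(d_x, d_z))

def log_posterior_grad(z, x):
    # Gradient of log p(z | x) = log N(x; W z, sigma^2 I) + log N(z; 0, I) + const.
    return W.T @ (x - W @ z) / sigma**2 - z

def encode(x):
    # Stand-in for a learned initializer: noisy least-squares estimate of z.
    return np.linalg.pinv(W) @ x + 0.3 * rng.normal(size=d_z)

def langevin_refine(z, x, step=0.01, n_steps=200):
    # Short-run Langevin MCMC started from the initializer's output,
    # improving the latent toward the true posterior.
    for _ in range(n_steps):
        z = z + step * log_posterior_grad(z, x) \
              + np.sqrt(2.0 * step) * rng.normal(size=z.shape)
    return z

# Simulate one observation and compute the exact Gaussian posterior (for reference).
z_true = rng.normal(size=d_z)
x = W @ z_true + sigma * rng.normal(size=d_x)
post_cov = np.linalg.inv(W.T @ W / sigma**2 + np.eye(d_z))
post_mean = post_cov @ (W.T @ x / sigma**2)

z0 = encode(x)               # Step 1: initialize the latent variable.
z1 = langevin_refine(z0, x)  # Step 2: improve it with an MCMC sampler.
```

Because the refiner starts from an initializer already near the posterior, only a short MCMC run is needed; averaging the final states of many such chains recovers the exact posterior mean on this toy model.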

Published

2021-05-18

How to Cite

Sun, J., Liu, R., & Zhou, B. (2021). HiABP: Hierarchical Initialized ABP for Unsupervised Representation Learning. Proceedings of the AAAI Conference on Artificial Intelligence, 35(11), 9747-9755. Retrieved from https://ojs.aaai.org/index.php/AAAI/article/view/17172

Section

AAAI Technical Track on Machine Learning IV