HiABP: Hierarchical Initialized ABP for Unsupervised Representation Learning
Keywords: Representation Learning, Unsupervised & Self-Supervised Learning, Applications
Abstract
Although Markov chain Monte Carlo (MCMC) is useful for generating samples from the posterior distribution, it often suffers from intractability when dealing with large-scale datasets. To address this issue, we propose Hierarchical Initialized Alternating Back-propagation (HiABP) for efficient Bayesian inference. Specifically, we endow the Alternating Back-propagation (ABP) method with a well-designed initializer and a hierarchical structure, composing a pipeline of initializing, improving, and learning steps. Constraining the initializer to be close to the true posterior distribution saves the generative model considerable time in initializing the latent variable. The initialized latent variable is then improved significantly by an MCMC sampler. The proposed method thus combines the strengths of both approaches, i.e., the effectiveness of MCMC and the efficiency of variational inference. Experimental results validate that our framework outperforms other popular deep generative models in modeling natural images and learning from incomplete data. We further demonstrate unsupervised disentanglement of the hierarchical latent representation with controllable image synthesis.
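The initialize-then-improve inference loop described above can be illustrated with a minimal sketch. The snippet below is not the paper's implementation: it uses a toy linear generator, a crude least-squares stand-in for the learned initializer, and short-run Langevin dynamics as the MCMC sampler, all of which are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear generator x = W z + noise (stand-in for a deep generator network).
d_z, d_x = 4, 16
W = rng.normal(size=(d_x, d_z))
sigma = 0.1

def generate(z):
    return W @ z

def grad_log_posterior(z, x):
    # Gradient of log p(z | x) up to a constant, for the toy model:
    # log p ∝ -||x - g(z)||^2 / (2 sigma^2) - ||z||^2 / 2.
    return W.T @ (x - W @ z) / sigma**2 - z

# Observation drawn from the toy model.
z_true = rng.normal(size=d_z)
x = generate(z_true) + sigma * rng.normal(size=d_x)

# Step 1 (Initializing): an amortized initializer proposes z near the
# posterior; here a noisy least-squares guess plays that role.
z = np.linalg.lstsq(W, x, rcond=None)[0] + 0.5 * rng.normal(size=d_z)
err_init = np.linalg.norm(x - generate(z))

# Step 2 (Improving): short-run Langevin dynamics refines the initialized
# latent: z <- z + (s/2) grad log p(z|x) + sqrt(s) * eps.
step = 1e-4
for _ in range(100):
    z = (z + 0.5 * step * grad_log_posterior(z, x)
           + np.sqrt(step) * rng.normal(size=d_z))
err_refined = np.linalg.norm(x - generate(z))

# Step 3 (Learning) would then update the generator parameters by
# back-propagation, treating (x, z) as a complete-data sample.
```

Because the initializer already lands near the posterior mode, only a short MCMC chain is needed, which is the efficiency argument the abstract makes.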
How to Cite
Sun, J., Liu, R., & Zhou, B. (2021). HiABP: Hierarchical Initialized ABP for Unsupervised Representation Learning. Proceedings of the AAAI Conference on Artificial Intelligence, 35(11), 9747-9755. Retrieved from https://ojs.aaai.org/index.php/AAAI/article/view/17172
AAAI Technical Track on Machine Learning IV