HiABP: Hierarchical Initialized ABP for Unsupervised Representation Learning
DOI: https://doi.org/10.1609/aaai.v35i11.17172
Keywords: Representation Learning, Unsupervised & Self-Supervised Learning, Applications
Abstract
Although Markov chain Monte Carlo (MCMC) is useful for generating samples from the posterior distribution, it often becomes intractable on large-scale datasets. To address this issue, we propose Hierarchical Initialized Alternating Back-propagation (HiABP) for efficient Bayesian inference. Specifically, we endow the Alternating Back-propagation (ABP) method with a well-designed initializer and a hierarchical structure, yielding a pipeline of initializing, improving, and learning back-propagation. Constraining a sampler to stay close to the true posterior distribution lets the generative model initialize the latent variables cheaply; the initialized latent variables are then improved significantly by an MCMC sampler. The proposed method thus combines the strengths of both approaches: the effectiveness of MCMC and the efficiency of variational inference. Experimental results show that our framework outperforms other popular deep generative models in modeling natural images and learning from incomplete data. We further demonstrate unsupervised disentanglement of the hierarchical latent representation through controllable image synthesis.
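The abstract outlines an initialize-improve-learn loop: an amortized sampler proposes a latent code, short-run MCMC refines it, and the generator is updated by back-propagation. Below is a minimal sketch in PyTorch of how such a pipeline could look. The single-level latent structure, the module names (initializer, generator, langevin_improve), the hyperparameters, and the Gaussian observation model p(x|z) = N(g(z), sigma^2 I) are all illustrative assumptions, not the paper's exact hierarchical architecture or objective.

```python
# Sketch of an initialize / improve / learn step, under the assumptions above.
import torch
import torch.nn as nn

latent_dim, data_dim, sigma = 16, 784, 0.3

generator = nn.Sequential(   # g(z): latent -> data mean
    nn.Linear(latent_dim, 256), nn.ReLU(), nn.Linear(256, data_dim))
initializer = nn.Sequential( # amortized sampler proposing z near the posterior
    nn.Linear(data_dim, 256), nn.ReLU(), nn.Linear(256, latent_dim))

def log_joint(x, z):
    # log p(x|z) + log p(z) up to constants, with a standard normal prior on z
    recon = generator(z)
    return (-((x - recon) ** 2).sum(1) / (2 * sigma ** 2)
            - 0.5 * (z ** 2).sum(1))

def langevin_improve(x, z, steps=20, step_size=0.1):
    # Improving: short-run Langevin MCMC started from the initializer's output
    for _ in range(steps):
        z = z.detach().requires_grad_(True)
        grad = torch.autograd.grad(log_joint(x, z).sum(), z)[0]
        z = z + 0.5 * step_size ** 2 * grad + step_size * torch.randn_like(z)
    return z.detach()

opt_g = torch.optim.Adam(generator.parameters(), lr=1e-4)
opt_i = torch.optim.Adam(initializer.parameters(), lr=1e-4)

def train_step(x):
    z0 = initializer(x)               # Initializing: cheap posterior proposal
    z = langevin_improve(x, z0)       # Improving: MCMC refinement
    loss_g = -log_joint(x, z).mean()  # Learning: update generator by back-prop
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()
    # Pull the initializer toward the improved posterior sample
    loss_i = ((initializer(x) - z) ** 2).sum(1).mean()
    opt_i.zero_grad(); loss_i.backward(); opt_i.step()
    return loss_g.item(), loss_i.item()
```

Here the initializer is trained to regress onto the MCMC-improved sample, which is one simple way of keeping the amortized sampler close to the true posterior; the paper's actual training objective and multi-layer latent hierarchy may differ.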
Published: 2021-05-18
How to Cite
Sun, J., Liu, R., & Zhou, B. (2021). HiABP: Hierarchical Initialized ABP for Unsupervised Representation Learning. Proceedings of the AAAI Conference on Artificial Intelligence, 35(11), 9747-9755. https://doi.org/10.1609/aaai.v35i11.17172
Section: AAAI Technical Track on Machine Learning IV