Learning Flexible Latent Representations via Encapsulated Variational Encoders

Authors

  • Wenjun Bai, Kobe University
  • Changqin Quan, Kobe University
  • Zhi-Wei Luo, Kobe University

DOI:

https://doi.org/10.1609/aaai.v33i01.33019913

Abstract

Learning flexible latent representations of observed data is an important precursor for most downstream AI applications. To this end, we propose a novel form of variational encoder, the encapsulated variational encoder (EVE), which exerts direct control over the encoded latent representations, together with its learning algorithm, an EVE-compatible automatic variational differentiation inference algorithm. Equipped with this control, the derived EVE can learn both converged and diverged latent representations. Using CIFAR-10 as an example, we show that learning converged latent representations considerably improves the discriminative performance of the semi-supervised EVE. Using MNIST as a demonstration, we show that the generative modelling performance of the EVE-induced variational auto-encoder (EVAE) is largely enhanced by the learned diverged latent representations.
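
The abstract does not spell out the EVE objective, but the general idea of exerting direct control over encoded latents can be illustrated with a short sketch. The code below is a hypothetical illustration, not the paper's algorithm: a standard Gaussian encoder with a reparameterized sample, plus an auxiliary penalty that either pulls same-class latent means toward their class centroid ("converged") or pushes class centroids apart ("diverged"). All names (ControlledVariationalEncoder, latent_control_loss, the `mode` flag) and the form of the penalty are assumptions made for the sake of the example.

```python
# Hypothetical sketch of a variational encoder with an explicit control term
# on the latent representations. This is NOT the paper's EVE formulation; it
# only illustrates steering latents toward "converged" (same-class means
# pulled together) or "diverged" (class centroids pushed apart) configurations.
import torch
import torch.nn as nn


class ControlledVariationalEncoder(nn.Module):
    """Gaussian encoder q(z|x) with a reparameterized latent sample."""

    def __init__(self, in_dim: int, hidden_dim: int, latent_dim: int):
        super().__init__()
        self.backbone = nn.Sequential(nn.Linear(in_dim, hidden_dim), nn.ReLU())
        self.mu_head = nn.Linear(hidden_dim, latent_dim)
        self.logvar_head = nn.Linear(hidden_dim, latent_dim)

    def forward(self, x):
        h = self.backbone(x)
        mu, logvar = self.mu_head(h), self.logvar_head(h)
        # Reparameterization trick: z = mu + sigma * eps, eps ~ N(0, I).
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
        return z, mu, logvar


def kl_to_standard_normal(mu, logvar):
    """KL(q(z|x) || N(0, I)) for a diagonal Gaussian, the usual VAE term."""
    return -0.5 * torch.mean(torch.sum(1 + logvar - mu**2 - logvar.exp(), dim=1))


def latent_control_loss(mu, labels, mode="converge"):
    """Assumed stand-in for EVE's direct control over the latent space:
    "converge" penalizes the spread of each class around its centroid;
    "diverge" rewards pairwise distance between class centroids."""
    classes = labels.unique()
    centroids = torch.stack([mu[labels == c].mean(dim=0) for c in classes])
    if mode == "converge":
        loss = 0.0
        for i, c in enumerate(classes):
            loss = loss + ((mu[labels == c] - centroids[i]) ** 2).sum(dim=1).mean()
        return loss / len(classes)
    # "diverge": negative mean pairwise centroid distance (spread classes out).
    dists = torch.cdist(centroids, centroids, p=2)
    n = len(classes)
    return -dists.sum() / (n * (n - 1) + 1e-8)


# Usage sketch on MNIST-sized inputs (784 = 28x28 flattened pixels).
enc = ControlledVariationalEncoder(in_dim=784, hidden_dim=256, latent_dim=32)
x = torch.randn(64, 784)            # a dummy batch of flattened images
y = torch.randint(0, 10, (64,))     # dummy class labels
z, mu, logvar = enc(x)
loss = kl_to_standard_normal(mu, logvar) + latent_control_loss(mu, y, mode="converge")
loss.backward()
```

Under this reading, the semi-supervised CIFAR-10 result would correspond to training with the "converge" penalty (tighter class clusters aid discrimination), and the EVAE result on MNIST to the "diverge" penalty (well-separated latent regions aid generation); the actual mechanism in the paper may differ.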

Published

2019-07-17

How to Cite

Bai, W., Quan, C., & Luo, Z.-W. (2019). Learning Flexible Latent Representations via Encapsulated Variational Encoders. Proceedings of the AAAI Conference on Artificial Intelligence, 33(01), 9913-9914. https://doi.org/10.1609/aaai.v33i01.33019913

Section

Student Abstract Track