Learning Cycle-Consistent Cooperative Networks via Alternating MCMC Teaching for Unsupervised Cross-Domain Translation

Authors

  • Jianwen Xie, Baidu Research
  • Zilong Zheng, University of California, Los Angeles
  • Xiaolin Fang, Massachusetts Institute of Technology
  • Song-Chun Zhu, University of California, Los Angeles; Tsinghua University; Peking University
  • Ying Nian Wu, University of California, Los Angeles

DOI:

https://doi.org/10.1609/aaai.v35i12.17249

Keywords:

Representation Learning, Computational Photography, Image & Video Synthesis, Unsupervised & Self-Supervised Learning, Transfer/Adaptation/Multi-task/Meta/Automated Learning

Abstract

This paper studies the unsupervised cross-domain translation problem by proposing a generative framework in which the probability distribution of each domain is represented by a generative cooperative network consisting of an energy-based model and a latent variable model. The use of a generative cooperative network enables maximum likelihood learning of the domain model by MCMC teaching, where the energy-based model seeks to fit the data distribution of its domain and distills its knowledge to the latent variable model via MCMC. Specifically, in the MCMC teaching process, the latent variable model, parameterized by an encoder-decoder, maps examples from the source domain to the target domain, while the energy-based model further refines the mapped results by Langevin revision so that the revised results match the examples in the target domain in terms of the statistical properties defined by the learned energy function. To build a correspondence between two unpaired domains, the proposed framework simultaneously learns a pair of cooperative networks with cycle consistency, accounting for two-way translation between the domains, by alternating MCMC teaching. Experiments show that the proposed framework is useful for unsupervised image-to-image translation and unpaired image sequence translation.
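To make the training procedure described in the abstract concrete, the following is a minimal sketch (not the authors' released code) of one alternating MCMC-teaching iteration: a generator G maps domain X to Y and a symmetric generator F maps Y to X, energy-based models refine the initial translations by Langevin revision, and the generators are updated to reproduce the revised samples together with a cycle-consistency term. The network architectures, step sizes, learning rates, and loss weights below are illustrative assumptions.

```python
# Minimal sketch of cycle-consistent cooperative networks with alternating
# MCMC teaching. All hyperparameters and architectures are placeholders.
import torch
import torch.nn as nn

def small_cnn(in_ch, out_ch):
    # Illustrative stand-in for the encoder-decoder latent variable model.
    return nn.Sequential(nn.Conv2d(in_ch, 32, 3, padding=1), nn.ReLU(),
                         nn.Conv2d(32, out_ch, 3, padding=1), nn.Tanh())

class Energy(nn.Module):
    """Scalar energy; low on real target-domain images (p ~ exp(-E))."""
    def __init__(self, ch=3):
        super().__init__()
        self.net = nn.Sequential(nn.Conv2d(ch, 32, 3, padding=1), nn.ReLU(),
                                 nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                                 nn.Linear(32, 1))
    def forward(self, y):
        return self.net(y).sum()

def langevin_revise(energy, y, steps=15, step_size=0.01):
    """Langevin revision of initial translations y under the learned energy."""
    y = y.clone().detach().requires_grad_(True)
    for _ in range(steps):
        grad = torch.autograd.grad(energy(y), y)[0]
        y = (y - 0.5 * step_size**2 * grad
             + step_size * torch.randn_like(y)).detach().requires_grad_(True)
    return y.detach()

# Two cooperative networks: (G, E_Y) for X -> Y and (F, E_X) for Y -> X.
G, F = small_cnn(3, 3), small_cnn(3, 3)
E_Y, E_X = Energy(), Energy()
opt_G = torch.optim.Adam(list(G.parameters()) + list(F.parameters()), lr=1e-4)
opt_E = torch.optim.Adam(list(E_Y.parameters()) + list(E_X.parameters()), lr=1e-4)

def train_step(x, y):  # unpaired minibatches from domains X and Y
    # 1) Initial translations by the latent variable models.
    y_init, x_init = G(x), F(y)
    # 2) Langevin revision by the energy-based models (teaching targets).
    y_rev, x_rev = langevin_revise(E_Y, y_init), langevin_revise(E_X, x_init)
    # 3) Update EBMs: lower energy on real data, raise it on revised samples.
    loss_E = (E_Y(y) - E_Y(y_rev)) + (E_X(x) - E_X(x_rev))
    opt_E.zero_grad(); loss_E.backward(); opt_E.step()
    # 4) MCMC teaching: generators chase the revised samples; cycle
    #    consistency ties the two translation directions together.
    loss_G = ((G(x) - y_rev) ** 2).mean() + ((F(y) - x_rev) ** 2).mean() \
           + ((F(G(x)) - x) ** 2).mean() + ((G(F(y)) - y) ** 2).mean()
    opt_G.zero_grad(); loss_G.backward(); opt_G.step()
```

A training run would simply call train_step on unpaired minibatches drawn from the two domains; the alternation between the EBM update and the generator update is what the paper refers to as alternating MCMC teaching.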


Published

2021-05-18

How to Cite

Xie, J., Zheng, Z., Fang, X., Zhu, S.-C., & Wu, Y. N. (2021). Learning Cycle-Consistent Cooperative Networks via Alternating MCMC Teaching for Unsupervised Cross-Domain Translation. Proceedings of the AAAI Conference on Artificial Intelligence, 35(12), 10430-10440. https://doi.org/10.1609/aaai.v35i12.17249

Issue

Vol. 35 No. 12 (2021)

Section

AAAI Technical Track on Machine Learning V