EFANet: Exchangeable Feature Alignment Network for Arbitrary Style Transfer

Authors

  • Zhijie Wu, Shenzhen University
  • Chunjin Song, Shenzhen University
  • Yang Zhou, Shenzhen University
  • Minglun Gong, University of Guelph
  • Hui Huang, Shenzhen University

DOI:

https://doi.org/10.1609/aaai.v34i07.6914

Abstract

Style transfer has been an important topic in both computer vision and graphics. Since the seminal work of Gatys et al. first demonstrated the power of stylization through optimization in deep feature space, quite a few approaches have achieved real-time arbitrary style transfer with straightforward statistic matching techniques. In this work, our key observation is that considering only the features of the input style image, whether for global deep feature statistic matching or for local patch swapping, may not always ensure satisfactory style transfer; see, e.g., Figure 1. Instead, we propose a novel transfer framework, EFANet, that jointly analyzes and better aligns exchangeable features extracted from the content and style image pair. In this way, the style features seek the best compatibility with the content information in the content image, leading to more structured stylization results. In addition, a new whitening loss is developed to purify the computed content features for better fusion with style information in feature space. Qualitative and quantitative experiments demonstrate the advantages of our approach.
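To make the ideas in the abstract concrete, below is a minimal PyTorch sketch of (a) channel-wise statistic matching in deep feature space and (b) whitening of content features, together with a whitening-style penalty. This is not the authors' EFANet implementation; the function names (adain, whiten, whitening_loss) and the exact form of the penalty are illustrative assumptions, not details taken from the paper.

    import torch

    def adain(content_feat, style_feat, eps=1e-5):
        # Match channel-wise mean/std of the content features to those of the
        # style features: the "straightforward statistic matching" the abstract
        # mentions. Inputs are (N, C, H, W) feature maps from a VGG-like encoder.
        c_mean = content_feat.mean(dim=(2, 3), keepdim=True)
        c_std = content_feat.std(dim=(2, 3), keepdim=True) + eps
        s_mean = style_feat.mean(dim=(2, 3), keepdim=True)
        s_std = style_feat.std(dim=(2, 3), keepdim=True) + eps
        return s_std * (content_feat - c_mean) / c_std + s_mean

    def whiten(feat, eps=1e-5):
        # ZCA-whiten an (N, C, H, W) feature map: remove channel correlations,
        # i.e. "purify" the content representation before fusing in a style.
        n, c, h, w = feat.shape
        x = feat.reshape(n, c, h * w)
        x = x - x.mean(dim=2, keepdim=True)
        cov = x @ x.transpose(1, 2) / (h * w - 1)        # (N, C, C) covariance
        eigval, eigvec = torch.linalg.eigh(cov)          # symmetric eigendecomposition
        inv_sqrt = eigvec @ torch.diag_embed((eigval + eps).rsqrt()) @ eigvec.transpose(1, 2)
        return (inv_sqrt @ x).reshape(n, c, h, w)

    def whitening_loss(feat, eps=1e-5):
        # Hypothetical penalty pushing the channel correlation matrix of a
        # feature map toward the identity, so the features behave as if whitened.
        n, c, h, w = feat.shape
        x = feat.reshape(n, c, h * w)
        x = x - x.mean(dim=2, keepdim=True)
        x = x / (x.norm(dim=2, keepdim=True) + eps)      # unit-norm rows
        corr = x @ x.transpose(1, 2)                     # (N, C, C) correlations
        eye = torch.eye(c, device=feat.device).expand(n, c, c)
        return ((corr - eye) ** 2).mean()

Under these assumptions, a stylized feature map could then be produced as, e.g., adain(whiten(f_c), f_s) before decoding back to image space; the actual EFANet alignment is more involved than this sketch.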

Published

2020-04-03

How to Cite

Wu, Z., Song, C., Zhou, Y., Gong, M., & Huang, H. (2020). EFANet: Exchangeable Feature Alignment Network for Arbitrary Style Transfer. Proceedings of the AAAI Conference on Artificial Intelligence, 34(07), 12305-12312. https://doi.org/10.1609/aaai.v34i07.6914

Issue

Vol. 34 No. 07

Section

AAAI Technical Track: Vision