TET-GAN: Text Effects Transfer via Stylization and Destylization


  • Shuai Yang Peking University
  • Jiaying Liu Peking University
  • Wenjing Wang Peking University
  • Zongming Guo Peking University




Text effects transfer technology automatically makes text dramatically more impressive. However, previous style transfer methods either model general styles, and thus cannot handle the highly structured text effects that follow the glyph, or require manually designed, subtle matching criteria for text effects. In this paper, we exploit the powerful representation abilities of deep neural features for text effects transfer. To this end, we propose a novel Texture Effects Transfer GAN (TET-GAN), which consists of a stylization subnetwork and a destylization subnetwork. The key idea is to train the network on both style transfer and style removal, so that it learns to disentangle and recombine the content and style features of text effects images. To support the training of our network, we propose a new text effects dataset with 64 professionally designed styles on 837 characters. We show that the disentangled feature representations enable a single network to transfer or remove all of these styles on arbitrary glyphs. Furthermore, the flexible network design allows TET-GAN to extend efficiently to a new text style via one-shot learning, where only a single example is required. We demonstrate the superiority of the proposed method over state-of-the-art methods in generating high-quality stylized text.
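The abstract's key idea, disentangling a stylized text image into a content (glyph) feature and a style feature, then recombining them for stylization and discarding style for destylization, can be illustrated with a toy sketch. The functions below are purely illustrative stand-ins (thresholding and mean color in place of the paper's learned encoders and GAN losses); all names are hypothetical and are not the authors' actual implementation.

```python
import numpy as np

def content_encoder(image):
    """Toy 'content' feature: a binary glyph mask (stand-in for a learned encoder)."""
    return (image > 0.5).astype(float)

def style_encoder(image, glyph):
    """Toy 'style' feature: mean intensity inside the glyph region."""
    mask = glyph > 0.5
    return float(image[mask].mean()) if mask.any() else 0.0

def destylize(stylized):
    """Style removal: recover the plain glyph from a stylized image."""
    return content_encoder(stylized)

def stylize(glyph, style_feature):
    """Style transfer: recombine a glyph's content feature with a style feature."""
    return content_encoder(glyph) * style_feature

# A plain glyph and a "styled" rendering of it (uniform 0.8 intensity).
glyph = np.zeros((4, 4))
glyph[1:3, 1:3] = 1.0
stylized = glyph * 0.8

# Destylization recovers the glyph; stylization repaints a new glyph.
recovered = destylize(stylized)
style = style_encoder(stylized, recovered)
new_glyph = np.zeros((4, 4))
new_glyph[0:2, 0:2] = 1.0
out = stylize(new_glyph, style)
```

In the actual TET-GAN, both tasks share a content encoder, which is what forces the disentangled representation; this sketch mirrors only that high-level data flow.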




How to Cite

Yang, S., Liu, J., Wang, W., & Guo, Z. (2019). TET-GAN: Text Effects Transfer via Stylization and Destylization. Proceedings of the AAAI Conference on Artificial Intelligence, 33(01), 1238-1245. https://doi.org/10.1609/aaai.v33i01.33011238



AAAI Technical Track: Applications