TextNAS: A Neural Architecture Search Space Tailored for Text Representation

Authors

  • Yujing Wang, Microsoft Research Asia
  • Yaming Yang, Microsoft Research Asia
  • Yiren Chen, Peking University
  • Jing Bai, Microsoft Research Asia
  • Ce Zhang, ETH Zurich
  • Guinan Su, University of Science and Technology of China
  • Xiaoyu Kou, Peking University
  • Yunhai Tong, Peking University
  • Mao Yang, Microsoft Research Asia
  • Lidong Zhou, Microsoft Research Asia

DOI

https://doi.org/10.1609/aaai.v34i05.6462

Abstract

Learning text representation is crucial for text classification and other language-related tasks. A diverse set of text representation networks exists in the literature, and finding the optimal one is a non-trivial problem. Recently, emerging Neural Architecture Search (NAS) techniques have demonstrated good potential to solve this problem. Nevertheless, most existing NAS works focus on the search algorithms and pay little attention to the search space. In this paper, we argue that the search space is also an important human prior for the success of NAS in different applications. Thus, we propose a novel search space tailored for text representation. Through automatic search, the discovered network architecture outperforms state-of-the-art models on various public datasets for text classification and natural language inference tasks. Furthermore, some of the design principles found in the automatically discovered network agree well with human intuition.
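
To make the notion of a "search space" concrete, the sketch below models one as a set of candidate layer operations plus a random sampler over per-layer choices and skip connections. The operation mix (1-D convolutions of several widths, pooling, a recurrent layer, and multi-head self-attention) reflects the kinds of operators the paper searches over, but every name and the sampler itself are illustrative assumptions, not the authors' implementation.

    # Minimal sketch, assuming a search space is (a) a candidate-operation
    # set and (b) a distribution over layer-wise choices and skip links.
    # All identifiers here are hypothetical, for illustration only.
    import random

    CANDIDATE_OPS = [
        "conv1", "conv3", "conv5", "conv7",  # 1-D convolutions of several widths
        "max_pool", "avg_pool",              # pooling layers
        "gru",                               # recurrent layer
        "self_attention",                    # multi-head self-attention
    ]

    def sample_architecture(num_layers, seed=None):
        """Sample one architecture: each layer picks an operation and,
        independently, keeps each earlier layer as a skip-connection input."""
        rng = random.Random(seed)
        layers = []
        for i in range(num_layers):
            layers.append({
                "op": rng.choice(CANDIDATE_OPS),
                "skips": [j for j in range(i) if rng.random() < 0.5],
            })
        return layers

    if __name__ == "__main__":
        # Draw one candidate network from the toy space and print its layers.
        for layer in sample_architecture(num_layers=4, seed=0):
            print(layer)

A search algorithm (reinforcement learning, evolution, or weight sharing) would then score such sampled candidates on a validation set and bias future sampling toward stronger ones; the mixed operator set is what tailors this toy space toward text representation.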

Published

2020-04-03

How to Cite

Wang, Y., Yang, Y., Chen, Y., Bai, J., Zhang, C., Su, G., Kou, X., Tong, Y., Yang, M., & Zhou, L. (2020). TextNAS: A Neural Architecture Search Space Tailored for Text Representation. Proceedings of the AAAI Conference on Artificial Intelligence, 34(05), 9242-9249. https://doi.org/10.1609/aaai.v34i05.6462

Issue

Vol. 34 No. 05 (2020)

Section

AAAI Technical Track: Natural Language Processing