Dynamic Compositionality in Recursive Neural Networks with Structure-Aware Tag Representations


  • Taeuk Kim, Seoul National University
  • Jihun Choi, Seoul National University
  • Daniel Edmiston, University of Chicago
  • Sanghwan Bae, Seoul National University
  • Sang-goo Lee, Seoul National University




Most existing recursive neural network (RvNN) architectures utilize only the structure of parse trees, ignoring the syntactic tags that parsing provides as a by-product. We present a novel RvNN architecture that provides dynamic compositionality by considering comprehensive syntactic information derived from both the structure and the linguistic tags. Specifically, we introduce a structure-aware tag representation constructed by a separate tag-level tree-LSTM. With this representation, we can control the composition function of the existing word-level tree-LSTM by supplying it as an additional input to the tree-LSTM's gate functions. In extensive experiments, we show that models built upon the proposed architecture obtain superior or competitive performance on several sentence-level tasks such as sentiment analysis and natural language inference when compared against previous tree-structured models and other sophisticated neural models.
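To make the core idea concrete, the following is a minimal NumPy sketch of a binary tree-LSTM cell whose gates receive an extra tag vector alongside the two child hidden states. This is an illustrative simplification, not the authors' exact parameterization: the class name, dimensions, and the placeholder tag vector are all assumptions; in the paper, the tag vector would itself come from a separate tag-level tree-LSTM run over the parse tree's syntactic tags.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class TagAugmentedTreeLSTMCell:
    """Sketch of a binary tree-LSTM cell with tag-conditioned gates.

    Each gate is computed from the left/right child hidden states plus a
    structure-aware tag representation, so the composition function can
    vary dynamically with the syntactic context (hypothetical layout).
    """
    def __init__(self, h_dim, tag_dim, seed=0):
        rng = np.random.default_rng(seed)
        # Five gate blocks: input i, left forget fl, right forget fr,
        # output o, and candidate g, all conditioned on [h_left; h_right; tag].
        in_dim = 2 * h_dim + tag_dim
        self.W = rng.normal(0.0, 0.1, (5 * h_dim, in_dim))
        self.b = np.zeros(5 * h_dim)
        self.h_dim = h_dim

    def __call__(self, left, right, tag):
        (hl, cl), (hr, cr) = left, right
        z = self.W @ np.concatenate([hl, hr, tag]) + self.b
        i, fl, fr, o, g = np.split(z, 5)
        i, fl, fr, o = sigmoid(i), sigmoid(fl), sigmoid(fr), sigmoid(o)
        c = i * np.tanh(g) + fl * cl + fr * cr   # separate forget gate per child
        h = o * np.tanh(c)
        return h, c

# Compose two (zero-initialized) leaf states under a placeholder tag vector.
h_dim, tag_dim = 8, 4
cell = TagAugmentedTreeLSTMCell(h_dim, tag_dim)
leaf = (np.zeros(h_dim), np.zeros(h_dim))
tag_vec = np.ones(tag_dim)  # stand-in for a structure-aware tag representation
h, c = cell(leaf, leaf, tag_vec)
```

Because the tag vector enters every gate, the same pair of child representations can be composed differently under, say, an NP node than under a VP node, which is the "dynamic compositionality" the abstract refers to.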




How to Cite

Kim, T., Choi, J., Edmiston, D., Bae, S., & Lee, S.-g. (2019). Dynamic Compositionality in Recursive Neural Networks with Structure-Aware Tag Representations. Proceedings of the AAAI Conference on Artificial Intelligence, 33(01), 6594-6601. https://doi.org/10.1609/aaai.v33i01.33016594



AAAI Technical Track: Natural Language Processing