Improving Tree-Structured Decoder Training for Code Generation via Mutual Learning

Authors

  • Binbin Xie, Xiamen University; Peng Cheng Laboratory
  • Jinsong Su, Xiamen University; Peng Cheng Laboratory
  • Yubin Ge, University of Illinois at Urbana-Champaign
  • Xiang Li, Xiaomi AI Lab
  • Jianwei Cui, Xiaomi AI Lab
  • Junfeng Yao, Xiamen University
  • Bin Wang, Xiaomi AI Lab

DOI:

https://doi.org/10.1609/aaai.v35i16.17662

Keywords:

Generation

Abstract

Code generation aims to automatically generate a piece of code given an input natural language utterance. In dominant models, it is currently treated as a sequence-to-tree task, where a decoder outputs a sequence of actions corresponding to the pre-order traversal of an Abstract Syntax Tree. However, such a decoder only exploits the preceding actions along the pre-order traversal, which are insufficient to ensure correct action predictions. In this paper, we first thoroughly analyze the difference in context modeling between neural code generation models whose decoders follow different traversals (pre-order traversal vs. breadth-first traversal), and then propose a mutual learning framework to jointly train these models. Under this framework, we continuously enhance both models via mutual distillation, which involves the synchronous execution of two one-to-one knowledge transfers at each training step. More specifically, we alternately choose one model as the student and the other as its teacher, and require the student to fit both the training data and the action prediction distributions of its teacher. By doing so, both models can fully absorb each other's knowledge and thus be improved simultaneously. Experimental results and in-depth analysis on several benchmark datasets demonstrate the effectiveness of our approach. We release our code at https://github.com/DeepLearnXMU/CGML.
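The sketch below illustrates, under stated assumptions, the kind of mutual-distillation step the abstract describes: two decoders each fit the gold action sequence while also matching the other's (detached) action prediction distribution, with both transfers executed in the same training step. The ToyDecoder class, tensor shapes, and the alpha weighting are hypothetical placeholders for illustration only; for the authors' actual implementation, see the repository linked above.

# Minimal sketch of a mutual-distillation training step, assuming two hypothetical
# decoders (pre-order vs. breadth-first traversal) that output per-step action
# distributions over a shared action vocabulary. Illustrative only, not the
# released implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

ACTION_VOCAB = 50   # hypothetical action vocabulary size
HIDDEN = 64         # hypothetical hidden size

class ToyDecoder(nn.Module):
    """Stand-in for a tree-structured decoder: maps context vectors to action logits."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(HIDDEN, HIDDEN), nn.Tanh(),
                                 nn.Linear(HIDDEN, ACTION_VOCAB))
    def forward(self, ctx):
        return self.net(ctx)  # (batch, steps, ACTION_VOCAB) logits

def mutual_distillation_losses(logits_a, logits_b, gold_actions, alpha=0.5):
    """Each model fits the gold actions and the detached distribution of the other.

    Both one-to-one knowledge transfers happen in the same training step.
    """
    ce_a = F.cross_entropy(logits_a.flatten(0, 1), gold_actions.flatten())
    ce_b = F.cross_entropy(logits_b.flatten(0, 1), gold_actions.flatten())
    # KL(teacher || student); the teacher's distribution is detached so each
    # model learns from, rather than perturbs, its current teacher.
    kl_a = F.kl_div(F.log_softmax(logits_a, dim=-1),
                    F.softmax(logits_b.detach(), dim=-1), reduction="batchmean")
    kl_b = F.kl_div(F.log_softmax(logits_b, dim=-1),
                    F.softmax(logits_a.detach(), dim=-1), reduction="batchmean")
    return ce_a + alpha * kl_a, ce_b + alpha * kl_b

if __name__ == "__main__":
    pre_order, breadth_first = ToyDecoder(), ToyDecoder()
    opt = torch.optim.Adam(list(pre_order.parameters()) +
                           list(breadth_first.parameters()), lr=1e-3)
    ctx = torch.randn(8, 12, HIDDEN)                 # fake encoder states: 8 examples, 12 steps
    gold = torch.randint(0, ACTION_VOCAB, (8, 12))   # fake gold action sequences
    loss_a, loss_b = mutual_distillation_losses(pre_order(ctx), breadth_first(ctx), gold)
    (loss_a + loss_b).backward()
    opt.step()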

Published

2021-05-18

How to Cite

Xie, B., Su, J., Ge, Y., Li, X., Cui, J., Yao, J., & Wang, B. (2021). Improving Tree-Structured Decoder Training for Code Generation via Mutual Learning. Proceedings of the AAAI Conference on Artificial Intelligence, 35(16), 14121-14128. https://doi.org/10.1609/aaai.v35i16.17662

Section

AAAI Technical Track on Speech and Natural Language Processing III