Neural Semantic Parsing in Low-Resource Settings with Back-Translation and Meta-Learning

Authors

  • Yibo Sun Harbin Institute of Technology
  • Duyu Tang Microsoft Research Asia
  • Nan Duan Microsoft Research Asia
  • Yeyun Gong Microsoft Research Asia
  • Xiaocheng Feng Harbin Institute of Technology
  • Bing Qin Harbin Institute of Technology
  • Daxin Jiang Microsoft Search Technology Center Asia

DOI:

https://doi.org/10.1609/aaai.v34i05.6427

Abstract

Neural semantic parsing has achieved impressive results in recent years, yet its success relies on the availability of large amounts of supervised data. Our goal is to learn a neural semantic parser when only prior knowledge about a limited number of simple rules is available, without access to either annotated programs or execution results. Our approach is initialized by rules and improved in a back-translation paradigm using question-program pairs generated by the semantic parser and the question generator. A phrase table with frequent mapping patterns is automatically derived, and updated as training progresses, to measure the quality of generated instances. We train the model with model-agnostic meta-learning to ensure accuracy and stability on examples covered by rules, while acquiring the versatility to generalize to examples not covered by rules. Results on three benchmark datasets with different domains and programs show that our approach incrementally improves accuracy. On WikiSQL, our best model is comparable to the state-of-the-art system learned from denotations.
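The meta-learning setup described in the abstract, adapting on rule-covered examples while updating meta-parameters for generalization to uncovered examples, can be illustrated with a generic first-order MAML step. This is a minimal sketch on a toy linear-regression objective, not the authors' parser or training code; the support/query split here merely stands in for rule-covered versus uncovered question-program pairs.

```python
import numpy as np

def mse_grad(w, X, y):
    # Gradient of 0.5 * mean((Xw - y)^2) with respect to w.
    return X.T @ (X @ w - y) / len(y)

def maml_step(w, support, query, inner_lr=0.1, meta_lr=0.05):
    """One first-order MAML step: adapt on the support set
    (standing in for rule-covered examples), then update the
    meta-parameters using the adapted weights' gradient on the
    query set (standing in for uncovered examples)."""
    Xs, ys = support
    Xq, yq = query
    w_adapted = w - inner_lr * mse_grad(w, Xs, ys)  # inner-loop adaptation
    meta_grad = mse_grad(w_adapted, Xq, yq)         # first-order approximation
    return w - meta_lr * meta_grad

# Toy data; purely illustrative, not from the paper.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
X = rng.normal(size=(40, 2))
y = X @ true_w
w = np.zeros(2)
for _ in range(200):
    w = maml_step(w, (X[:20], y[:20]), (X[20:], y[20:]))
```

The first-order approximation drops the second-order term of the meta-gradient (as in FOMAML), which keeps the sketch short; the full method differentiates through the inner update.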

Published

2020-04-03

How to Cite

Sun, Y., Tang, D., Duan, N., Gong, Y., Feng, X., Qin, B., & Jiang, D. (2020). Neural Semantic Parsing in Low-Resource Settings with Back-Translation and Meta-Learning. Proceedings of the AAAI Conference on Artificial Intelligence, 34(05), 8960-8967. https://doi.org/10.1609/aaai.v34i05.6427

Section

AAAI Technical Track: Natural Language Processing