Effective Slot Filling via Weakly-Supervised Dual-Model Learning

Authors

  • Jue Wang, Zhejiang University
  • Ke Chen, Zhejiang University
  • Lidan Shou, Zhejiang University
  • Sai Wu, Zhejiang University
  • Gang Chen, Zhejiang University

DOI:

https://doi.org/10.1609/aaai.v35i16.17643

Keywords:

Conversational AI/Dialog Systems

Abstract

Slot filling is a challenging task in Spoken Language Understanding (SLU). Supervised methods usually require large amounts of annotation to maintain desirable performance. A solution that relieves this heavy dependency on labeled data is bootstrapping, which leverages unlabeled data; however, bootstrapping is known to suffer from semantic drift. We argue that semantic drift can be tackled by exploiting the correlation between slot values (phrases) and their respective types. Building on a particular form of weakly labeled data, namely the plain phrases contained in sentences, we propose a weakly-supervised slot filling approach. Our approach trains two models, a classifier and a tagger, which effectively learn from each other on the weakly labeled data. Experimental results demonstrate that our approach outperforms standard baselines on multiple datasets, especially in the low-resource setting.
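The dual-model idea from the abstract can be made concrete with a short sketch. The Python code below shows one plausible round of mutual pseudo-labeling between the classifier and the tagger on weakly labeled phrases; the class and method names, model interfaces, and the 0.9 confidence threshold are all illustrative assumptions, not the authors' actual training procedure.

    # Illustrative sketch only: interfaces, names, and the confidence
    # threshold are assumptions, not the paper's implementation.
    from dataclasses import dataclass

    @dataclass
    class WeakExample:
        tokens: list[str]             # sentence tokens
        spans: list[tuple[int, int]]  # phrase boundaries; slot types unknown

    def dual_learning_round(classifier, tagger, batch, threshold=0.9):
        """One round of mutual pseudo-labeling on weakly labeled phrases.

        Assumed (hypothetical) model interfaces:
          classifier.predict(tokens, span) -> (slot_type, confidence)
          tagger.predict_span_type(tokens, span) -> (slot_type, confidence)
          model.fit(examples) fine-tunes on (tokens, span, slot_type) triples.
        """
        tagger_data, classifier_data = [], []
        for ex in batch:
            for span in ex.spans:
                # The classifier types each known phrase; confident
                # predictions become pseudo labels that supervise the tagger.
                c_type, c_conf = classifier.predict(ex.tokens, span)
                if c_conf >= threshold:
                    tagger_data.append((ex.tokens, span, c_type))
                # Symmetrically, the tagger's confident span-type predictions
                # supervise the classifier.
                t_type, t_conf = tagger.predict_span_type(ex.tokens, span)
                if t_conf >= threshold:
                    classifier_data.append((ex.tokens, span, t_type))
        tagger.fit(tagger_data)
        classifier.fit(classifier_data)

Under this reading, drift is curbed because a pseudo-label is accepted only when the producing model is confident, and each model is trained on labels it did not generate itself.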

Published

2021-05-18

How to Cite

Wang, J., Chen, K., Shou, L., Wu, S., & Chen, G. (2021). Effective Slot Filling via Weakly-Supervised Dual-Model Learning. Proceedings of the AAAI Conference on Artificial Intelligence, 35(16), 13952-13960. https://doi.org/10.1609/aaai.v35i16.17643

Section

AAAI Technical Track on Speech and Natural Language Processing III