A Student-Teacher Architecture for Dialog Domain Adaptation Under the Meta-Learning Setting
Keywords: Conversational AI/Dialog Systems
Abstract
Numerous new dialog domains are created every day, while collecting data for these domains is extremely costly since it involves human interaction. Therefore, it is essential to develop algorithms that can adapt to different domains efficiently when building data-driven dialog models. Most recent research on domain adaptation focuses on giving the model a better initialization, rather than optimizing the adaptation process itself. We propose an efficient, domain-adaptive, task-oriented dialog system that incorporates a meta-teacher model to quantify how much each generated token matters given its context. We first train our base dialog model and the meta-teacher model adversarially in a meta-learning setting on rich-resource domains. The meta-teacher learns to quantify the importance of tokens under different contexts across different domains. During adaptation, the meta-teacher guides the dialog model to focus on important tokens, improving adaptation efficiency. We evaluate our model on two multi-domain datasets, MultiWOZ and Google Schema-Guided Dialogue, and achieve state-of-the-art performance.
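The core idea of the abstract, a teacher that reweights the student's per-token loss so adaptation concentrates on context-important tokens, can be illustrated with a minimal sketch. This is an assumption-laden toy, not the paper's implementation: the function name `weighted_nll`, the example log-probabilities, and the teacher weights are all hypothetical.

```python
import math

def weighted_nll(token_logprobs, teacher_weights):
    # Teacher-weighted negative log-likelihood: each token's NLL is
    # scaled by the meta-teacher's importance weight, so the student
    # dialog model focuses its updates on tokens the teacher deems
    # important for the current context (hypothetical formulation).
    assert len(token_logprobs) == len(teacher_weights)
    total_w = sum(teacher_weights)
    return -sum(w * lp for w, lp in zip(teacher_weights, token_logprobs)) / total_w

# Hypothetical per-token log-probs from the student dialog model
logprobs = [math.log(0.9), math.log(0.2), math.log(0.6)]
# Hypothetical teacher weights: the second token matters most here
weights = [0.5, 2.0, 1.0]
loss = weighted_nll(logprobs, weights)
```

With uniform weights this reduces to ordinary mean NLL; non-uniform weights let a poorly predicted but important token dominate the adaptation gradient.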
How to Cite
Qian, K., Wei, W., & Yu, Z. (2021). A Student-Teacher Architecture for Dialog Domain Adaptation Under the Meta-Learning Setting. Proceedings of the AAAI Conference on Artificial Intelligence, 35(15), 13692-13700. Retrieved from https://ojs.aaai.org/index.php/AAAI/article/view/17614
AAAI Technical Track on Speech and Natural Language Processing II