Modeling Dialogues with Hashcode Representations: A Nonparametric Approach

Authors

  • Sahil Garg, USC Information Sciences Institute
  • Irina Rish, University of Montreal
  • Guillermo Cecchi, IBM Research
  • Palash Goyal, USC Information Sciences Institute
  • Sarik Ghazarian, USC Information Sciences Institute
  • Shuyang Gao, USC Information Sciences Institute
  • Greg Ver Steeg, USC Information Sciences Institute
  • Aram Galstyan, USC Information Sciences Institute

DOI:

https://doi.org/10.1609/aaai.v34i04.5813

Abstract

We propose a novel dialogue modeling framework, the first nonparametric, kernel-function-based approach to dialogue modeling, which learns hashcodes as text representations; unlike traditional deep learning models, it handles relatively small datasets well while also scaling to large ones. We also derive a novel lower bound on mutual information, used as a model-selection criterion that favors representations with better alignment between the utterances of participants in a collaborative dialogue setting, as well as higher predictability of the generated responses. As demonstrated on three real-life datasets, most notably psychotherapy sessions, the proposed approach significantly outperforms several state-of-the-art neural-network-based dialogue systems, both in computational efficiency, reducing training time from days or weeks to hours, and in response quality, being chosen as the best model by human evaluators an order of magnitude more often than its competitors.
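For intuition about the core idea of kernel-based hashcode representations, the sketch below shows one generic way to turn kernel similarities into binary codes: represent an utterance by its kernel similarities to a set of reference (landmark) utterances, then binarize with random hyperplanes. This is a minimal illustration under assumed choices (RBF kernel, random landmark and projection vectors, hypothetical feature dimensions), not the paper's exact construction or model-selection procedure.

```python
import numpy as np

def rbf_kernel(x, y, gamma=1.0):
    """RBF (Gaussian) kernel between two feature vectors."""
    return np.exp(-gamma * np.sum((x - y) ** 2))

def kernel_hashcode(x, references, projections, gamma=1.0):
    """Map a feature vector to a binary hashcode.

    references: landmark vectors, shape (m, d); kernel similarities to these
        form a nonparametric feature vector for x.
    projections: random hyperplanes in that m-dimensional kernel space, shape (k, m).
    Returns a length-k array of 0/1 bits.
    """
    k_vec = np.array([rbf_kernel(x, r, gamma) for r in references])
    k_vec -= k_vec.mean()  # center so hyperplanes through the origin split the data
    return (projections @ k_vec > 0).astype(int)

# Toy usage with hypothetical 5-dimensional utterance features.
rng = np.random.default_rng(0)
refs = rng.normal(size=(10, 5))    # 10 reference (landmark) utterances
proj = rng.normal(size=(16, 10))   # 16-bit hashcodes
utterance = rng.normal(size=5)
print(kernel_hashcode(utterance, refs, proj))
```

In a setup like this, the landmark set and kernel parameters play the role of the representation being selected; the paper's mutual-information lower bound would serve as the criterion for choosing among such candidate hashing configurations.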


Published

2020-04-03

How to Cite

Garg, S., Rish, I., Cecchi, G., Goyal, P., Ghazarian, S., Gao, S., Ver Steeg, G., & Galstyan, A. (2020). Modeling Dialogues with Hashcode Representations: A Nonparametric Approach. Proceedings of the AAAI Conference on Artificial Intelligence, 34(04), 3970-3979. https://doi.org/10.1609/aaai.v34i04.5813

Issue

Vol. 34 No. 04 (2020)

Section

AAAI Technical Track: Machine Learning