End-to-End Quantum-like Language Models with Application to Question Answering

Authors

  • Peng Zhang, Tianjin University
  • Jiabin Niu, Tianjin University
  • Zhan Su, Tianjin University
  • Benyou Wang, Tencent
  • Liqun Ma, Tianjin University
  • Dawei Song, Tianjin University, China; The Open University, United Kingdom

Keywords

Question Answering, Quantum-like Language Model, Neural Network

Abstract

Language Modeling (LM) is a fundamental research topic in a range of areas. Recently, inspired by quantum theory, a novel Quantum Language Model (QLM) has been proposed for Information Retrieval (IR). In this paper, we aim to broaden the theoretical and practical basis of QLM. We develop a Neural Network based Quantum-like Language Model (NNQLM) and apply it to Question Answering. Specifically, based on word embeddings, we design a new density matrix, which represents a sentence (e.g., a question or an answer) and encodes a mixture of semantic subspaces. Such a density matrix, together with a joint representation of the question and the answer, can be integrated into neural network architectures (e.g., 2-dimensional convolutional neural networks). Experiments on the TREC-QA and WIKIQA datasets have verified the effectiveness of our proposed models.
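To make the abstract's core construction concrete, the sketch below builds a sentence density matrix from word embeddings in the spirit described: each L2-normalized word vector defines a rank-1 projector, and their weighted sum is a trace-one, positive semi-definite matrix encoding a mixture of semantic subspaces. This is an illustrative assumption-laden sketch (uniform mixture weights, random toy vectors), not the paper's exact NNQLM implementation.

```python
import numpy as np

def density_matrix(embeddings, weights=None):
    """Build a sentence density matrix from word embeddings.

    Each word vector is L2-normalized so its outer product is a
    rank-1 projector; the weighted sum is then a trace-1, positive
    semi-definite matrix (a mixture of semantic subspaces).
    Uniform weights are an illustrative assumption.
    """
    E = np.asarray(embeddings, dtype=float)
    n, d = E.shape
    if weights is None:
        weights = np.full(n, 1.0 / n)  # uniform mixture weights
    rho = np.zeros((d, d))
    for w, e in zip(weights, E):
        v = e / np.linalg.norm(e)     # unit vector -> rank-1 projector
        rho += w * np.outer(v, v)
    return rho

# Toy example: 3 "word vectors" in a 4-dimensional embedding space.
rng = np.random.default_rng(0)
rho = density_matrix(rng.standard_normal((3, 4)))
print(np.isclose(np.trace(rho), 1.0))  # trace is one
```

Such a matrix for a question and one for an answer can then be combined (e.g., via their matrix product) into a joint representation fed to a 2-D convolutional network, as the abstract outlines.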

Published

2018-04-27

How to Cite

Zhang, P., Niu, J., Su, Z., Wang, B., Ma, L., & Song, D. (2018). End-to-End Quantum-like Language Models with Application to Question Answering. Proceedings of the AAAI Conference on Artificial Intelligence, 32(1). Retrieved from https://ojs.aaai.org/index.php/AAAI/article/view/11979