Learning Concept Embeddings for Query Expansion by Quantum Entropy Minimization

Authors

  • Alessandro Sordoni Université de Montréal
  • Yoshua Bengio Université de Montréal
  • Jian-Yun Nie Université de Montréal

DOI:

https://doi.org/10.1609/aaai.v28i1.8933

Keywords:

query expansion, word embeddings

Abstract

In web search, user queries are typically formulated with only a few terms, so term-matching retrieval functions may fail to retrieve relevant documents. Given a user query, query expansion (QE) consists in selecting related terms that increase the likelihood of retrieving relevant documents. Selecting such expansion terms is challenging and requires a computational framework capable of encoding complex semantic relationships. In this paper, we propose a novel method for learning, in a supervised way, semantic representations for words and phrases. By embedding queries and documents in special matrices, our model enjoys greater representational power than existing approaches that adopt a vector representation. We show that our model produces high-quality query expansion terms, and that our expansions improve IR measures beyond expansions from current word-embedding models and from well-established traditional QE methods.
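To illustrate the general idea of matrix-based query representations (not the paper's actual trained model), the sketch below builds a density-like matrix for a query as a trace-normalized mixture of rank-one projectors of its term embeddings, then scores candidate expansion terms with a quantum-expectation-style product. The embeddings, vocabulary, and function names here are illustrative assumptions; the paper learns its representations in a supervised way rather than using random vectors.

```python
import numpy as np

# Toy, randomly initialized unit-norm embeddings (assumed for illustration only;
# the paper learns these representations from supervised data).
rng = np.random.default_rng(0)
dim = 8
vocab = ["jaguar", "car", "speed", "cat", "wildlife", "forest"]
emb = {w: rng.normal(size=dim) for w in vocab}
emb = {w: v / np.linalg.norm(v) for w, v in emb.items()}

def density_matrix(terms):
    """Represent a query as a mixture of rank-one projectors |v><v|
    of its term embeddings, normalized to unit trace like a quantum
    density matrix. A matrix can encode a mixture of 'senses', which
    a single query vector cannot."""
    rho = sum(np.outer(emb[t], emb[t]) for t in terms)
    return rho / np.trace(rho)

def expansion_score(rho, term):
    """Quantum-expectation-style score tr(rho |v><v|) = v^T rho v,
    measuring how well a candidate term fits the query matrix."""
    v = emb[term]
    return float(v @ rho @ v)

# Rank candidate expansion terms for a two-term query.
query = ["jaguar", "car"]
rho = density_matrix(query)
candidates = [w for w in vocab if w not in query]
ranked = sorted(candidates, key=lambda w: expansion_score(rho, w), reverse=True)
```

Because `rho` is positive semidefinite with unit trace and the embeddings are unit-norm, every score lies in [0, 1]; ranking candidates by this score is the matrix analogue of cosine similarity against a single query vector.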

Published

2014-06-21

How to Cite

Sordoni, A., Bengio, Y., & Nie, J.-Y. (2014). Learning Concept Embeddings for Query Expansion by Quantum Entropy Minimization. Proceedings of the AAAI Conference on Artificial Intelligence, 28(1). https://doi.org/10.1609/aaai.v28i1.8933