Neural Language Model Based Attentive Term Dependence Model for Verbose Query (Student Abstract)

Authors

  • Dipannita Podder, Indian Institute of Technology Kharagpur
  • Jiaul H. Paik, Indian Institute of Technology Kharagpur
  • Pabitra Mitra, Indian Institute of Technology Kharagpur

DOI:

https://doi.org/10.1609/aaai.v37i13.27010

Keywords:

Term Dependence Model, Verbose Query, Neural Language Model

Abstract

Query-document term matching plays an important role in information retrieval. However, retrieval performance degrades when documents match extraneous query terms, a problem that frequently arises in verbose queries. To address this problem, we generate dense vectors for the entire query and for the individual query terms using the pre-trained BERT (Bidirectional Encoder Representations from Transformers) model, and we analyze the relation between them to identify the contextually central terms. We then propose a context-aware attentive extension of the unsupervised Markov Random Field-based sequential term dependence model that explicitly pays more attention to those contextually central terms. The proposed model leverages the strengths of the pre-trained large language model to estimate the attention weights of terms and ranks the documents in a single pass without any supervision.
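The following is a minimal sketch, not the authors' implementation, of the core idea described above: embed the whole query and each query term with pre-trained BERT, then derive a per-term attention weight from how similar each term's contextual vector is to the query-level vector. The model name, mean pooling, cosine similarity, and softmax normalization are all assumptions for illustration; the paper's exact weighting scheme may differ.

```python
# Sketch: BERT-based attention weights for query terms.
# Assumptions (not from the paper): bert-base-uncased, mean pooling for the
# query vector, cosine similarity + softmax for the attention weights, and
# queries made of whitespace-separated terms without punctuation.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

def term_attention_weights(query: str) -> dict[str, float]:
    terms = query.split()
    enc = tokenizer(query, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**enc).last_hidden_state[0]  # (seq_len, 768)

    # Query-level vector: mean over all token vectors (one plausible pooling).
    query_vec = hidden.mean(dim=0)

    # Term-level vectors: average the WordPiece vectors belonging to each term.
    word_ids = enc.word_ids(0)  # maps each token to its source word index
    term_vecs = []
    for i in range(len(terms)):
        idx = [j for j, w in enumerate(word_ids) if w == i]
        term_vecs.append(hidden[idx].mean(dim=0))

    # Attention weight of a term = softmax over its cosine similarity with the
    # query vector, so contextually central terms receive higher weight.
    sims = torch.stack([torch.cosine_similarity(v, query_vec, dim=0)
                        for v in term_vecs])
    weights = torch.softmax(sims, dim=0)
    return dict(zip(terms, weights.tolist()))

print(term_attention_weights("find papers about neural retrieval models"))
```

These weights would then modulate the feature functions of the sequential dependence model, emphasizing matches on central terms over extraneous ones.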

Published

2024-07-15

How to Cite

Podder, D., Paik, J. H., & Mitra, P. (2024). Neural Language Model Based Attentive Term Dependence Model for Verbose Query (Student Abstract). Proceedings of the AAAI Conference on Artificial Intelligence, 37(13), 16300-16301. https://doi.org/10.1609/aaai.v37i13.27010