Quantifying the Impact of Cognitive Biases in Question-Answering Systems

Authors

  • Keith Burghardt (University of California, Davis)
  • Tad Hogg (Institute for Molecular Manufacturing)
  • Kristina Lerman (University of Southern California, Information Sciences Institute)

DOI:

https://doi.org/10.1609/icwsm.v12i1.15042

Keywords:

Crowdsourcing, Question-answering systems, Online Experiment, Big Data

Abstract

Crowdsourcing can identify high-quality solutions to problems; however, individual decisions are constrained by cognitive biases. We investigate some of these biases in an experimental model of a question-answering system. We observe a strong position bias in favor of answers appearing earlier in a list of choices. This effect is enhanced by three cognitive factors: the attention an answer receives, its perceived popularity, and cognitive load, measured by the number of choices a user has to process. While separately weak, these effects synergistically amplify position bias and decouple user choices of best answers from their intrinsic quality. We conclude by discussing how these findings can be applied to substantially improve how question-answering systems surface high-quality answers.
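The decoupling the abstract describes can be illustrated with a toy simulation (not the paper's experimental model; the geometric discount, parameter values, and function names below are illustrative assumptions): each answer has an intrinsic quality, but voters pick answers with probability proportional to quality discounted by list position. With a strong enough discount, the most-voted answer need not be the highest-quality one.

```python
import random
from collections import Counter

def simulate_votes(num_answers, num_voters=10_000, bias=0.7, seed=0):
    """Toy position-bias model (an illustrative assumption, not the
    paper's method): each answer gets an intrinsic quality in [0, 1],
    and each voter picks one answer with probability proportional to
    quality times a geometric discount `bias**position`, so earlier
    answers receive disproportionate attention."""
    rng = random.Random(seed)
    quality = [rng.random() for _ in range(num_answers)]
    # Position bias: answer at position p is discounted by bias**p.
    weights = [q * (bias ** pos) for pos, q in enumerate(quality)]
    picks = rng.choices(range(num_answers), weights=weights, k=num_voters)
    counts = Counter(picks)
    votes = [counts.get(i, 0) for i in range(num_answers)]
    return quality, votes

quality, votes = simulate_votes(10)
best_by_quality = quality.index(max(quality))
best_by_votes = votes.index(max(votes))
print("highest quality at position:", best_by_quality)
print("most votes at position:", best_by_votes)
```

Lowering `bias` (a stronger position discount, standing in for higher cognitive load) makes the two rankings disagree more often, mirroring the abstract's point that biased attention can override intrinsic quality.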

Published

2018-06-15

How to Cite

Burghardt, K., Hogg, T., & Lerman, K. (2018). Quantifying the Impact of Cognitive Biases in Question-Answering Systems. Proceedings of the International AAAI Conference on Web and Social Media, 12(1). https://doi.org/10.1609/icwsm.v12i1.15042