Crowdsourcing Objective Answers to Subjective Questions Online

Authors

  • Ravi Iyer (Ranker)

DOI:

https://doi.org/10.1609/hcomp.v1i1.13053

Keywords:

crowdsourcing, wisdom of crowds

Abstract

In this demonstration, we show how Ranker uses diverse sampling, measurement, and algorithmic techniques to crowdsource answers to subjective questions in a real-world online environment where user behavior is difficult to control. As of September 2013, Ranker receives approximately 8 million visitors and collects over 1.5 million user opinions each month. Such an environment requires tradeoffs between computational complexity, projected user engagement, and accuracy, and aggregating across diverse techniques allows us to mitigate the sizable errors specific to any individual imperfect crowdsourcing method. We will specifically show how relatively unstructured crowdsourcing can yield surprisingly accurate predictions of movie box-office revenue, celebrity mortality, and retail pizza-topping sales.
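As a minimal sketch of the aggregation idea above (not Ranker's actual algorithm), the following Python simulation assumes three invented estimation methods, each with its own systematic bias and noise, and shows that a simple mean of their outputs has lower average error than any single method; the ground-truth value and all method parameters are hypothetical.

    import random
    import statistics

    random.seed(42)

    TRUE_VALUE = 100.0                       # hypothetical ground truth (e.g., revenue in $M)
    METHODS = [(-8, 15), (5, 20), (12, 10)]  # (bias, noise sd) for each invented method
    TRIALS = 10_000

    method_errors = [[] for _ in METHODS]
    aggregate_errors = []

    for _ in range(TRIALS):
        # Each imperfect method produces a biased, noisy estimate of the truth.
        estimates = [TRUE_VALUE + bias + random.gauss(0, sd) for bias, sd in METHODS]
        for errors, estimate in zip(method_errors, estimates):
            errors.append(abs(estimate - TRUE_VALUE))
        # Aggregating (here, a simple mean) lets method-specific errors partially cancel.
        aggregate_errors.append(abs(statistics.mean(estimates) - TRUE_VALUE))

    for i, errors in enumerate(method_errors):
        print(f"method {i}:  mean abs error = {statistics.mean(errors):5.1f}")
    print(f"aggregate: mean abs error = {statistics.mean(aggregate_errors):5.1f}")

Averaging helps here because the individual methods' biases partially cancel and their noise shrinks when pooled; in general, the benefit depends on the methods' errors being at least partly independent of one another.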

Published

2013-11-03

How to Cite

Iyer, R. (2013). Crowdsourcing Objective Answers to Subjective Questions Online. Proceedings of the AAAI Conference on Human Computation and Crowdsourcing, 1(1), 93-94. https://doi.org/10.1609/hcomp.v1i1.13053