Crowdsourcing Objective Answers to Subjective Questions Online


  • Ravi Iyer, Ranker

Keywords: crowdsourcing, wisdom of crowds


In this demonstration, we show how Ranker combines diverse sampling, measurement, and aggregation techniques to crowdsource answers to subjective questions in a real-world online environment where user behavior is difficult to control. As of September 2013, Ranker receives approximately 8 million visitors each month and collects over 1.5 million monthly user opinions. Such an environment demands tradeoffs among computational complexity, projected user engagement, and accuracy, and aggregating across diverse techniques allows us to mitigate the sizable errors specific to individual imperfect crowdsourcing methods. We will specifically show how relatively unstructured crowdsourcing can yield surprisingly accurate predictions of movie box-office revenue, celebrity mortality, and retail pizza topping sales.
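The error-mitigation claim above rests on a standard wisdom-of-crowds property: when several imperfect methods produce estimates with independent errors, their average tends to be closer to the truth than any single method. The sketch below is an illustrative simulation of that property only, not Ranker's actual algorithms; the ground-truth value, noise level, and method count are all hypothetical.

```python
import random
import statistics

random.seed(0)

TRUE_VALUE = 100.0   # hypothetical ground truth (e.g., box-office revenue in $M)
N_METHODS = 5        # number of imperfect estimation methods being aggregated
N_TRIALS = 1000      # repeat the experiment to average out simulation noise

individual_errors = []  # error of a single method used alone
aggregate_errors = []   # error of the mean across all methods

for _ in range(N_TRIALS):
    # Each method yields a noisy estimate; errors are independent across methods.
    estimates = [TRUE_VALUE + random.gauss(0, 20) for _ in range(N_METHODS)]
    individual_errors.append(abs(estimates[0] - TRUE_VALUE))
    aggregate_errors.append(abs(statistics.mean(estimates) - TRUE_VALUE))

print("mean single-method error:", statistics.mean(individual_errors))
print("mean aggregated error:   ", statistics.mean(aggregate_errors))
```

With independent Gaussian errors, averaging across k methods shrinks the error's standard deviation by a factor of sqrt(k), so the aggregated error printed above is reliably smaller than the single-method error; correlated errors across methods would weaken this benefit, which is one reason diverse techniques matter.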

How to Cite

Iyer, R. (2013). Crowdsourcing Objective Answers to Subjective Questions Online. Proceedings of the AAAI Conference on Human Computation and Crowdsourcing, 1(1), 93-94.