CrowdUtility: A Recommendation System for Crowdsourcing Platforms

Authors

  • Deepthi Chander, Xerox Research Centre India
  • Sakyajit Bhattacharya, Xerox Research Centre India
  • Elisa Celis, EPFL Lausanne
  • Koustuv Dasgupta, Xerox Research Centre India
  • Saraschandra Karanam, Xerox Research Centre India
  • Vaibhav Rajan, Xerox Research Centre India
  • Avantika Gupta, Xerox Research Centre India

Keywords

Crowdsourcing, Platform Recommendation, QoS

Abstract

Crowd workers exhibit varying work patterns, expertise, and quality, leading to wide variability in the performance of crowdsourcing platforms. The onus of choosing a suitable platform on which to post tasks rests largely with the requester, and the dynamism in platform performance often results in poor guarantees and unmet requirements. Towards this end, we demonstrate CrowdUtility, a statistical-modelling-based tool that evaluates multiple crowdsourcing platforms and recommends the platform that best suits the requester's requirements. CrowdUtility uses an online Multi-Armed Bandit framework to schedule tasks while optimizing platform performance. We demonstrate an end-to-end system spanning requirements specification, platform recommendation, and real-time monitoring.
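The abstract does not specify which bandit algorithm CrowdUtility uses. As a rough illustration of the online Multi-Armed Bandit idea, the sketch below uses UCB1 (an assumption, not the paper's method) to allocate tasks across hypothetical platforms whose task-success rates are unknown to the scheduler: each platform is an "arm", each posted task is a "pull", and the scheduler learns to favour the better-performing platform while still occasionally exploring the others.

```python
import math
import random

def ucb1_select(counts, rewards, t):
    """Return the index of the platform with the highest UCB1 score.

    counts[i]  - number of tasks routed to platform i so far
    rewards[i] - total observed reward (e.g., accepted task outcomes)
    t          - current round, used in the exploration bonus
    """
    # Play each platform once before applying the UCB1 formula.
    for i, n in enumerate(counts):
        if n == 0:
            return i
    # UCB1: empirical mean + exploration bonus sqrt(2 ln t / n_i).
    return max(
        range(len(counts)),
        key=lambda i: rewards[i] / counts[i]
        + math.sqrt(2.0 * math.log(t) / counts[i]),
    )

def schedule_tasks(true_quality, rounds=2000, seed=0):
    """Simulate routing `rounds` tasks; platform qualities are hypothetical."""
    rng = random.Random(seed)
    k = len(true_quality)
    counts = [0] * k
    rewards = [0.0] * k
    for t in range(1, rounds + 1):
        arm = ucb1_select(counts, rewards, t)
        # Bernoulli reward: 1 if the platform completes the task well.
        reward = 1.0 if rng.random() < true_quality[arm] else 0.0
        counts[arm] += 1
        rewards[arm] += reward
    return counts

# Three hypothetical platforms with unknown (to the scheduler) quality.
counts = schedule_tasks([0.40, 0.70, 0.55])
```

Over 2000 simulated tasks, the scheduler concentrates most pulls on the highest-quality platform (index 1) while retaining a small exploration budget for the rest, which is the behaviour a platform-recommendation system needs when platform performance can drift.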

Published

2014-09-05

How to Cite

Chander, D., Bhattacharya, S., Celis, E., Dasgupta, K., Karanam, S., Rajan, V., & Gupta, A. (2014). CrowdUtility: A Recommendation System for Crowdsourcing Platforms. Proceedings of the AAAI Conference on Human Computation and Crowdsourcing, 2(1). Retrieved from https://ojs.aaai.org/index.php/HCOMP/article/view/13138