Monetary Interventions in Crowdsourcing Task Switching

Authors

  • Ming Yin, Harvard University
  • Yiling Chen, Harvard University
  • Yu-An Sun, Xerox Innovation Group

Abstract

With a large number of tasks of various types, requesters on crowdsourcing platforms often bundle tasks of different types into a single working session. This creates a task switching setting, in which workers must shift between different cognitive tasks. We design and conduct an experiment on Amazon Mechanical Turk to study how occasionally presented, performance-contingent monetary rewards, referred to as monetary interventions, affect worker performance in the task switching setting. We use two competing metrics to evaluate worker performance. When monetary interventions are placed on some tasks in a working session, our results show that worker performance on these tasks improves on both metrics. Moreover, worker performance on the other tasks, where monetary interventions are not placed, is also affected: workers perform better according to one metric, but worse according to the other. This suggests that, in addition to providing extrinsic monetary incentives for some tasks, monetary interventions implicitly set performance goals for all tasks. Furthermore, monetary interventions are most effective in improving worker performance when used at switch tasks, i.e., tasks that follow a task of a different type, in working sessions with a low task switching frequency.

Published

2014-09-05

How to Cite

Yin, M., Chen, Y., & Sun, Y.-A. (2014). Monetary Interventions in Crowdsourcing Task Switching. Proceedings of the AAAI Conference on Human Computation and Crowdsourcing, 2(1). Retrieved from https://ojs.aaai.org/index.php/HCOMP/article/view/13160