Volunteering Versus Work for Pay: Incentives and Tradeoffs in Crowdsourcing

Authors

  • Andrew Mao, Harvard University
  • Ece Kamar, Microsoft Research
  • Yiling Chen, Harvard University
  • Eric Horvitz, Microsoft Research
  • Megan Schwamb, Academia Sinica
  • Chris Lintott, University of Oxford
  • Arfon Smith, Zooniverse

DOI:

https://doi.org/10.1609/hcomp.v1i1.13075

Keywords:

payments, incentives, experiment, citizen science, volunteer, Amazon Mechanical Turk

Abstract

Paid and volunteer crowd work have emerged as means of harnessing human intelligence to perform diverse tasks. However, little is known about the relative performance of volunteer versus paid crowd work, or about how financial incentives influence the quality and efficiency of output. We study the performance of volunteers as well as workers paid under different monetary schemes on a difficult real-world crowdsourcing task. We observe that the performance of unpaid and paid workers can be compared in carefully designed tasks, that financial incentives can be used to trade quality for speed, and that the compensation system on Amazon Mechanical Turk creates particular indirect incentives for workers. Our methodology and results have implications for the choice of financial incentives and motivate further study of how monetary incentives influence worker behavior in crowdsourcing.

Published

2013-11-03

How to Cite

Mao, A., Kamar, E., Chen, Y., Horvitz, E., Schwamb, M., Lintott, C., & Smith, A. (2013). Volunteering Versus Work for Pay: Incentives and Tradeoffs in Crowdsourcing. Proceedings of the AAAI Conference on Human Computation and Crowdsourcing, 1(1), 94-102. https://doi.org/10.1609/hcomp.v1i1.13075