Crowdsourcing from Scratch: A Pragmatic Experiment in Data Collection by Novice Requesters

Authors

  • Alexandra Papoutsaki, Brown University
  • Hua Guo, Brown University
  • Danae Metaxa-Kakavouli, Brown University
  • Connor Gramazio, Brown University
  • Jeff Rasley, Brown University
  • Wenting Xie, Brown University
  • Guan Wang, Brown University
  • Jeff Huang, Brown University

DOI

https://doi.org/10.1609/hcomp.v3i1.13230

Keywords

crowdsourcing, Amazon Mechanical Turk, novice requesters

Abstract

As crowdsourcing has gained prominence in recent years, an increasing number of people have turned to popular crowdsourcing platforms for a wide range of tasks. Experienced members of the crowdsourcing community have developed numerous systems, both independently of and in conjunction with these platforms, along with other tools and design techniques, to gain more specialized functionality and overcome various shortcomings. It is unclear, however, how novice requesters using crowdsourcing platforms for general tasks experience them and how, if at all, their approaches deviate from the best practices established by the crowdsourcing research community. We conduct an experiment with a class of 19 students to study how novice requesters design crowdsourcing tasks. Each student tried their hand at crowdsourcing a real data collection task with a fixed budget and a realistic time constraint. Students used Amazon Mechanical Turk to gather information about the academic careers of over 2,000 professors from 50 top Computer Science departments in the U.S. In addition to curating this dataset, we classify the strategies that emerged, discuss the design choices students made along task dimensions, and compare these novice strategies to best practices identified in the crowdsourcing literature. Finally, we summarize the design pitfalls and effective strategies we observed to provide guidelines for novice requesters.
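The paper itself contains no code, and the page does not say what tooling the students used. As a rough illustration of the kind of data-collection task described above, the sketch below posts a single lookup HIT with the modern boto3 MTurk client. All task parameters (title, reward, form fields, the professor named) are hypothetical, and it targets the requester sandbox so nothing real is posted.

    """Illustrative sketch: posting one data-collection HIT on Amazon Mechanical Turk.

    Not the students' actual pipeline; parameter values are hypothetical.
    """
    import boto3

    # Sandbox endpoint; swap for the production endpoint to post real HITs.
    SANDBOX_URL = "https://mturk-requester-sandbox.us-east-1.amazonaws.com"

    client = boto3.client("mturk", region_name="us-east-1", endpoint_url=SANDBOX_URL)

    # An HTMLQuestion form asking a worker to look up one professor's record.
    # MTurk requires the assignmentId field to match the submission to the HIT.
    html_question = """
    <HTMLQuestion xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2011-11-11/HTMLQuestion.xsd">
      <HTMLContent><![CDATA[
        <!DOCTYPE html>
        <html><body>
          <form action="https://workersandbox.mturk.com/mturk/externalSubmit" method="post">
            <input type="hidden" name="assignmentId" id="assignmentId" value="">
            <p>Find the PhD-granting university and graduation year for:
               <b>Jane Doe, Example University</b> (hypothetical)</p>
            <p>PhD university: <input name="phd_university"></p>
            <p>PhD year: <input name="phd_year"></p>
            <input type="submit">
          </form>
          <script>
            // Copy the assignment id MTurk passes in the page URL into the form.
            document.getElementById("assignmentId").value =
              new URL(window.location).searchParams.get("assignmentId");
          </script>
        </body></html>
      ]]></HTMLContent>
      <FrameHeight>450</FrameHeight>
    </HTMLQuestion>
    """

    response = client.create_hit(
        Title="Find a professor's PhD university and year",
        Description="Look up one CS professor's academic record.",
        Keywords="data collection, lookup, academia",
        Reward="0.05",                    # USD per assignment (hypothetical)
        MaxAssignments=3,                 # redundant answers allow agreement checks
        LifetimeInSeconds=86400,          # HIT stays available for one day
        AssignmentDurationInSeconds=600,  # 10 minutes per worker
        Question=html_question,
    )
    print("HIT id:", response["HIT"]["HITId"])

Requesting several assignments per record and reconciling disagreements afterward is one of the redundancy-based quality-control practices the paper compares novice strategies against.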

Published

2015-09-23

How to Cite

Papoutsaki, A., Guo, H., Metaxa-Kakavouli, D., Gramazio, C., Rasley, J., Xie, W., Wang, G., & Huang, J. (2015). Crowdsourcing from Scratch: A Pragmatic Experiment in Data Collection by Novice Requesters. Proceedings of the AAAI Conference on Human Computation and Crowdsourcing, 3(1), 140-149. https://doi.org/10.1609/hcomp.v3i1.13230