Crowdsourcing Transcription Beyond Mechanical Turk

Authors

  • Haofeng Zhou, University of Texas at Austin
  • Denys Baskov, University of Texas at Austin
  • Matthew Lease, University of Texas at Austin

DOI:

https://doi.org/10.1609/hcomp.v1i1.13093

Keywords:

crowdsourcing, transcription

Abstract

While much work has studied crowdsourced transcription via Amazon’s Mechanical Turk, we are not familiar with any prior cross-platform analysis of crowdsourcing service providers for transcription. We present a qualitative and quantitative analysis of eight such providers: 1-888-Type-It-Up, 3Play Media, Transcription Hub, CastingWords, Rev, TranscribeMe, Quicktate, and SpeakerText. We also provide a comparative evaluation against three transcribers from oDesk. The spontaneous speech used in our experiments is drawn from the USC-SFI MALACH collection of oral history interviews. After informally evaluating pilot transcripts from all providers, our formal evaluation measures word error rate (WER) over 10-minute segments from six interviews transcribed by three service providers and the three oDesk transcribers. We report the WER obtained in each case, and more generally assess tradeoffs among the quality, cost, risk, and effort of alternative crowd-based transcription options.
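
For reference, the evaluation metric named in the abstract, word error rate (WER), is the word-level edit distance (substitutions, insertions, and deletions) between a hypothesis transcript and the reference transcript, divided by the number of reference words. The paper itself includes no code; the following is a minimal, generic Python sketch of the standard dynamic-programming computation (the wer function name and example strings are illustrative, not from the paper):

    def wer(reference: str, hypothesis: str) -> float:
        """Word error rate: word-level Levenshtein distance over reference length."""
        ref, hyp = reference.split(), hypothesis.split()
        # d[i][j] = edit distance between ref[:i] and hyp[:j]
        d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
        for i in range(len(ref) + 1):
            d[i][0] = i  # i deletions to reach an empty hypothesis
        for j in range(len(hyp) + 1):
            d[0][j] = j  # j insertions from an empty reference
        for i in range(1, len(ref) + 1):
            for j in range(1, len(hyp) + 1):
                cost = 0 if ref[i - 1] == hyp[j - 1] else 1
                d[i][j] = min(d[i - 1][j] + 1,         # deletion
                              d[i][j - 1] + 1,         # insertion
                              d[i - 1][j - 1] + cost)  # substitution or match
        return d[len(ref)][len(hyp)] / len(ref)

    # One substitution over four reference words -> WER = 0.25
    print(wer("the cat sat down", "the cat sat town"))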

Published

2013-11-03

How to Cite

Zhou, H., Baskov, D., & Lease, M. (2013). Crowdsourcing Transcription Beyond Mechanical Turk. Proceedings of the AAAI Conference on Human Computation and Crowdsourcing, 1(1), 9-16. https://doi.org/10.1609/hcomp.v1i1.13093

Section

Scaling Speech, Language Understanding and Dialogue through Crowdsourcing Workshop