In-HIT Example-Guided Annotation Aid for Crowdsourcing UI Components

Authors

  • Yi-Ching Huang, National Taiwan University
  • Chun-I Wang, National Taiwan University
  • Shih-Yuan Yu, National Taiwan University
  • Yung-jen Hsu, National Taiwan University

DOI:

https://doi.org/10.1609/hcomp.v1i1.13052

Keywords:

Crowdsourcing, Mechanical Turk, UI Component Annotation

Abstract

This paper presents an approach to crowdsourcing annotations of UI components from images. Using a “Find-Draw-Verify” task design, an in-HIT example-guided annotation aid is proposed to support workers and thereby improve result quality.
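
As a rough illustration of how a “Find-Draw-Verify” decomposition might be organized, the Python sketch below models the three stages as separate worker tasks over a shared screenshot record. The stage names come from the abstract; the data model, function names, and flow are hypothetical assumptions for illustration, not the authors' implementation.

```python
# Hypothetical sketch of a three-stage "Find-Draw-Verify" crowdsourcing pipeline.
# The stage names come from the paper's abstract; everything else here is an
# illustrative assumption, not the authors' actual task or data design.
from dataclasses import dataclass, field
from typing import List


@dataclass
class BoundingBox:
    x: int
    y: int
    width: int
    height: int


@dataclass
class ComponentAnnotation:
    label: str            # e.g. "button", "text field"
    box: BoundingBox
    verified: bool = False


@dataclass
class ScreenshotHIT:
    image_url: str
    found_labels: List[str] = field(default_factory=list)
    annotations: List[ComponentAnnotation] = field(default_factory=list)


def find_stage(hit: ScreenshotHIT, labels_from_worker: List[str]) -> None:
    """Find: a worker lists the UI components visible in the screenshot."""
    hit.found_labels.extend(labels_from_worker)


def draw_stage(hit: ScreenshotHIT, label: str, box: BoundingBox) -> None:
    """Draw: a worker outlines one previously found component."""
    hit.annotations.append(ComponentAnnotation(label=label, box=box))


def verify_stage(hit: ScreenshotHIT, index: int, accepted: bool) -> None:
    """Verify: another worker accepts or rejects a drawn annotation."""
    hit.annotations[index].verified = accepted
```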

Published

2013-11-03

How to Cite

Huang, Y.-C., Wang, C.-I., Yu, S.-Y., & Hsu, Y.-jen. (2013). In-HIT Example-Guided Annotation Aid for Crowdsourcing UI Components. Proceedings of the AAAI Conference on Human Computation and Crowdsourcing, 1(1), 91-92. https://doi.org/10.1609/hcomp.v1i1.13052