In-HIT Example-Guided Annotation Aid for Crowdsourcing UI Components
DOI: https://doi.org/10.1609/hcomp.v1i1.13052
Keywords: Crowdsourcing, Mechanical Turk, UI Component Annotation
Abstract
This paper presents an approach to crowdsourcing annotations of UI components from images. Using a "Find-Draw-Verify" task design, an in-HIT example-guided annotation aid is proposed to assist workers, thereby improving result quality.
Published
2013-11-03
How to Cite
Huang, Y.-C., Wang, C.-I., Yu, S.-Y., & Hsu, Y.-jen. (2013). In-HIT Example-Guided Annotation Aid for Crowdsourcing UI Components. Proceedings of the AAAI Conference on Human Computation and Crowdsourcing, 1(1), 91-92. https://doi.org/10.1609/hcomp.v1i1.13052
Section
Demonstrations