Drafty: Enlisting Users To Be Editors Who Maintain Structured Data

Authors

  • Shaun Wallace, Brown University
  • Lucy Van Kleunen, Brown University
  • Marianne Aubin-Le Quere, Brown University
  • Abraham Peterkin, Brown University
  • Yirui Huang, University of Toronto
  • Jeff Huang, Brown University

DOI:

https://doi.org/10.1609/hcomp.v5i1.13300

Keywords:

crowdsourcing, peer production, database systems, collective intelligence

Abstract

Structured datasets are difficult to keep up-to-date because the underlying facts evolve over time; curated data about business financials, organizational hierarchies, or drug interactions are constantly changing. Drafty is a platform that enlists visitors to an editable dataset to become "user-editors" who help solve this problem. It records and analyzes user-editors' within-page interactions to construct user interest profiles, creating a cyclical feedback mechanism that enables Drafty to target requests for specific corrections to user-editors. To validate the automatically generated user interest profiles, we surveyed participants who performed self-created tasks with Drafty and found that their user interest score was 3.2 higher on data they were interested in than on data they had no interest in. Next, a 7-month live experiment compared the efficacy of user-editor corrections depending on whether user-editors were asked to review data that matched their interests. Our findings suggest that user-editors are approximately 3 times more likely to provide accurate corrections for data matching their interest profiles, and about 2 times more likely to provide corrections in the first place.
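
As a rough illustration of the cyclical mechanism described in the abstract, the following minimal Python sketch shows one way within-page interactions could be aggregated into an interest profile and then used to choose which records a user-editor is asked to review. The event names, weights, and functions (INTERACTION_WEIGHTS, build_interest_profile, target_correction_requests) are hypothetical illustrations, not details taken from Drafty itself.

    # Illustrative sketch only: Drafty's actual scoring is not specified at this
    # level of detail. Assumes interest is accumulated from weighted per-topic
    # interaction counts (event names and weights are hypothetical).
    from collections import defaultdict

    INTERACTION_WEIGHTS = {"click": 1.0, "sort": 0.5, "search": 2.0, "edit": 3.0}

    def build_interest_profile(events):
        """Aggregate weighted interaction counts into a per-topic interest score."""
        profile = defaultdict(float)
        for topic, interaction in events:
            profile[topic] += INTERACTION_WEIGHTS.get(interaction, 0.0)
        return dict(profile)

    def target_correction_requests(profile, stale_rows, k=3):
        """Pick the k stale rows whose topics best match the user's interest profile."""
        ranked = sorted(stale_rows,
                        key=lambda row: profile.get(row["topic"], 0.0),
                        reverse=True)
        return ranked[:k]

    if __name__ == "__main__":
        events = [("HCI", "click"), ("HCI", "edit"),
                  ("Databases", "search"), ("HCI", "sort")]
        stale = [{"id": 1, "topic": "HCI"},
                 {"id": 2, "topic": "Databases"},
                 {"id": 3, "topic": "Theory"}]
        profile = build_interest_profile(events)
        print(profile)  # {'HCI': 4.5, 'Databases': 2.0}
        print(target_correction_requests(profile, stale, k=2))  # rows 1 and 2

In this sketch, routing review requests toward higher-scoring topics corresponds to the paper's finding that corrections are more likely, and more likely to be accurate, when the data matches a user-editor's interests.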

Published

2017-09-21

How to Cite

Wallace, S., Van Kleunen, L., Aubin-Le Quere, M., Peterkin, A., Huang, Y., & Huang, J. (2017). Drafty: Enlisting Users To Be Editors Who Maintain Structured Data. Proceedings of the AAAI Conference on Human Computation and Crowdsourcing, 5(1), 187-196. https://doi.org/10.1609/hcomp.v5i1.13300