Infusing Human Factors into Algorithmic Crowdsourcing

Authors

  • Han Yu Nanyang Technological University
  • Chunyan Miao Nanyang Technological University
  • Zhiqi Shen Nanyang Technological University
  • Jun Lin Nanyang Technological University
  • Cyril Leung University of British Columbia
  • Qiang Yang Hong Kong University of Science and Technology

DOI:

https://doi.org/10.1609/aaai.v30i2.19073

Abstract

Algorithmic Crowdsourcing (AC) is an emerging field in which computational methods are proposed to automate certain aspects of crowdsourcing. A number of AC methods have been proposed recently, but existing approaches are based on highly simplified models of worker behaviour, which limits their practical applicability. To make efficient use of human resources in crowdsourcing tasks, the following technical challenges remain open: fairness of the solution, temporal changes in behaviour, optimizing wellbeing, and non-compliance by users. For AI researchers to propose effective solutions to these challenges, labelled datasets reflecting various aspects of human decision-making related to task allocation in crowdsourcing are needed. We construct an anonymized dataset based on player behaviour trajectories captured by a multiagent game platform - Agile Manage. It allows players to demonstrate their task delegation strategies under different scenarios based on key characteristics involved in crowdsourcing task allocation. The game adopts implicit human computation, in which players contribute data that are valuable for research through casual gameplay.

Published

2016-02-18

How to Cite

Yu, H., Miao, C., Shen, Z., Lin, J., Leung, C., & Yang, Q. (2016). Infusing Human Factors into Algorithmic Crowdsourcing. Proceedings of the AAAI Conference on Artificial Intelligence, 30(2), 4062-4063. https://doi.org/10.1609/aaai.v30i2.19073