Goal-Setting Behavior of Workers on Crowdsourcing Platforms: An Exploratory Study on MTurk and Prolific
Keywords: Goal Setting, Goal Types, Goal Management Tools, Goal Achievement, Goal-Achievement Barriers, Crowdsourcing, Human Computation
Abstract
A wealth of evidence across several domains indicates that goal setting improves performance and learning by enabling individuals to commit their thoughts and actions to goal achievement. Recently, researchers have begun studying the effects of goal setting in paid crowdsourcing to improve the quality and quantity of contributions, increase learning gains, and hold participants accountable for contributing more effectively. However, there is a lack of research addressing crowd workers' goal-setting practices, how they currently pursue their goals, and the challenges they face. This information is essential for researchers and developers to create tools that assist crowd workers in pursuing their goals more effectively, thereby improving the quality of their contributions. This paper addresses these gaps through a mixed-method study in which we surveyed 205 workers from two crowdsourcing platforms -- Amazon Mechanical Turk (MTurk) and Prolific -- about their goal-setting practices. Through a 14-item survey, we asked workers about the types of goals they create, their goal achievement strategies, potential barriers that impede goal attainment, and their use of software tools for effective goal management. We discovered that workers (a) actively create intrinsic and extrinsic goals; (b) use a combination of tools for goal management; and (c) face obstacles to goal achievement such as medical issues and a busy lifestyle. We also (d) gathered novel features for future goal management tools. Our findings shed light on the broader implications of developing goal management tools to improve workers' well-being.
How to Cite
Abbas, T., & Gadiraju, U. (2022). Goal-Setting Behavior of Workers on Crowdsourcing Platforms: An Exploratory Study on MTurk and Prolific. Proceedings of the AAAI Conference on Human Computation and Crowdsourcing, 10(1), 2-13. https://doi.org/10.1609/hcomp.v10i1.21983