HirePeer: Impartial Peer-Assessed Hiring at Scale in Expert Crowdsourcing Markets


  • Yasmine Kotturi, Carnegie Mellon University
  • Anson Kahng, Carnegie Mellon University
  • Ariel Procaccia, Carnegie Mellon University
  • Chinmay Kulkarni, Carnegie Mellon University

Expert crowdsourcing (e.g., Upwork.com) offers promising benefits, such as productivity improvements for employers and flexible working arrangements for workers. Yet to realize these benefits, a key persistent challenge is effective hiring at scale. Current approaches, such as reputation systems and standardized competency tests, develop weaknesses such as score inflation over time, thus degrading market quality. This paper presents HirePeer, a novel alternative approach to hiring at scale that leverages peer assessment to elicit honest assessments of fellow workers' job application materials, which it then aggregates using an impartial ranking algorithm. This paper reports on three studies that investigate both the costs and the benefits to workers and employers of impartial peer-assessed hiring. First, we find that, to elicit honest assessments, algorithms must be communicated in terms of their impartial effects. Second, in practice, peer assessment is highly accurate, and impartial rank aggregation algorithms incur a small accuracy cost for their impartiality guarantee. Third, workers report finding peer-assessed hiring useful for receiving targeted feedback on their job materials.
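To make the impartiality guarantee concrete: an aggregation rule is impartial when no worker's own reported assessments can change that worker's position in the final ranking. The sketch below is not the paper's algorithm; it illustrates the property with a simpler, well-known partition mechanism (all names and scores are illustrative). Workers are randomly split into two groups, each group is ranked using only the other group's assessments, and the two rankings are merged in a fixed alternating order, so a worker's slot depends only on reviews they did not write.

```python
import random

def impartial_partition_ranking(workers, reviews, seed=0):
    """Illustrative 'partition' mechanism (not the paper's algorithm).

    Randomly split workers into two groups, rank each group using only
    the other group's scores, then interleave the rankings in a fixed
    alternating order. A worker's final position depends only on how
    the *other* group scored them, so their own reviews cannot move it.

    reviews: dict mapping (reviewer, candidate) -> numeric score.
    Returns a list of workers, best first.
    """
    rng = random.Random(seed)
    shuffled = list(workers)
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    group_a, group_b = shuffled[:half], shuffled[half:]

    def avg_from(candidate, reviewer_pool):
        # Average only the scores given by reviewers in the opposite group.
        scores = [s for (r, c), s in reviews.items()
                  if c == candidate and r in reviewer_pool]
        return sum(scores) / len(scores) if scores else 0.0

    ranked_a = sorted(group_a, key=lambda w: -avg_from(w, set(group_b)))
    ranked_b = sorted(group_b, key=lambda w: -avg_from(w, set(group_a)))

    # A fixed alternating interleave preserves impartiality: each worker's
    # slot is determined solely by their rank within their own group.
    merged = []
    for i in range(max(len(ranked_a), len(ranked_b))):
        if i < len(ranked_a):
            merged.append(ranked_a[i])
        if i < len(ranked_b):
            merged.append(ranked_b[i])
    return merged
```

The trade-off the paper measures follows directly from this structure: because half of the available assessments are discarded for each candidate, impartial mechanisms pay a small accuracy cost relative to naive score averaging.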

How to Cite

Kotturi, Y., Kahng, A., Procaccia, A., & Kulkarni, C. (2020). HirePeer: Impartial Peer-Assessed Hiring at Scale in Expert Crowdsourcing Markets. Proceedings of the AAAI Conference on Artificial Intelligence, 34(03), 2577-2584. https://doi.org/10.1609/aaai.v34i03.5641

AAAI Technical Track: Human-Computation and Crowd Sourcing