The ActiveCrowdToolkit: An Open-Source Tool for Benchmarking Active Learning Algorithms for Crowdsourcing Research

Authors

  • Matteo Venanzi, University of Southampton
  • Oliver Parson, University of Southampton
  • Alex Rogers, University of Southampton
  • Nick Jennings, University of Southampton

DOI:

https://doi.org/10.1609/hcomp.v3i1.13256

Keywords:

Crowdsourcing, Active learning, Benchmarking

Abstract

We present an open-source toolkit that allows easy comparison of the performance of active learning methods over a series of datasets. The toolkit allows such strategies to be constructed by combining a judgement aggregation model, a task selection method, and a worker selection method. The toolkit also provides a user interface which allows researchers to gain insight into worker performance and task classification at runtime.
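The decomposition described above — a strategy built from an aggregation model, a task selector, and a worker selector — can be sketched as follows. This is an illustrative outline only, not the toolkit's actual API: the function names, the majority-vote aggregator, the least-evidence task selector, and the random worker selector are all assumptions chosen for simplicity.

```python
import random
from collections import Counter

# Hypothetical sketch of the strategy decomposition described in the
# abstract. Names and component choices are illustrative, not the
# ActiveCrowdToolkit's real interface.

def majority_vote(judgements):
    """Aggregation model: majority vote over each task's judgements."""
    return {task: Counter(labels).most_common(1)[0][0]
            for task, labels in judgements.items()}

def select_task(judgements, tasks):
    """Task selection: pick the task with the fewest judgements so far."""
    return min(tasks, key=lambda t: len(judgements.get(t, [])))

def select_worker(workers):
    """Worker selection: pick a worker uniformly at random."""
    return random.choice(workers)

def run_strategy(tasks, workers, get_label, budget):
    """Active-learning loop: spend a fixed labelling budget, then aggregate."""
    judgements = {t: [] for t in tasks}
    for _ in range(budget):
        task = select_task(judgements, tasks)
        worker = select_worker(workers)
        judgements[task].append(get_label(worker, task))
    return majority_vote(judgements)
```

Swapping in a different aggregation model (e.g. one that weights workers by estimated accuracy) or a different task selector would change only one component, which is the kind of comparison the toolkit is designed to benchmark.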

Published

2015-09-23

How to Cite

Venanzi, M., Parson, O., Rogers, A., & Jennings, N. (2015). The ActiveCrowdToolkit: An Open-Source Tool for Benchmarking Active Learning Algorithms for Crowdsourcing Research. Proceedings of the AAAI Conference on Human Computation and Crowdsourcing, 3(1), 44-45. https://doi.org/10.1609/hcomp.v3i1.13256