The ActiveCrowdToolkit: An Open-Source Tool for Benchmarking Active Learning Algorithms for Crowdsourcing Research
DOI:
https://doi.org/10.1609/hcomp.v3i1.13256
Keywords:
Crowdsourcing, Active learning, Benchmarking
Abstract
We present an open-source toolkit that allows easy comparison of the performance of active learning methods across a series of datasets. The toolkit allows such strategies to be constructed by combining a judgement aggregation model, a task selection method and a worker selection method. The toolkit also provides a user interface that allows researchers to gain insight into worker performance and task classification at runtime.
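The abstract's notion of a strategy assembled from three pluggable components can be sketched as follows. This is an illustrative Python sketch only, not the toolkit's actual API: the function names (`majority_vote`, `fewest_labels_task`, `round_robin_worker`, `run_strategy`) and the specific component choices are hypothetical stand-ins for one possible aggregation model, task selection method and worker selection method.

```python
from collections import Counter

def majority_vote(judgements):
    """Aggregation model (illustrative): label each task by majority
    vote over the judgements collected for it."""
    return {task: Counter(labels).most_common(1)[0][0]
            for task, labels in judgements.items()}

def fewest_labels_task(judgements, tasks):
    """Task selection (illustrative): next task is the one with the
    fewest judgements collected so far."""
    return min(tasks, key=lambda t: len(judgements.get(t, [])))

def round_robin_worker(workers, step):
    """Worker selection (illustrative): cycle through the worker pool."""
    return workers[step % len(workers)]

def run_strategy(tasks, workers, get_label, budget):
    """Combine the three components into one active-learning loop:
    at each step, pick a task and a worker, request a judgement,
    then aggregate all judgements into final task labels."""
    judgements = {t: [] for t in tasks}
    for step in range(budget):
        task = fewest_labels_task(judgements, tasks)
        worker = round_robin_worker(workers, step)
        judgements[task].append(get_label(worker, task))
    return majority_vote(judgements)
```

Swapping any one of the three functions for an alternative (e.g. a probabilistic aggregation model in place of majority vote) yields a different strategy, which is the kind of comparison the toolkit is designed to benchmark.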
Published
2015-09-23
How to Cite
Venanzi, M., Parson, O., Rogers, A., & Jennings, N. (2015). The ActiveCrowdToolkit: An Open-Source Tool for Benchmarking Active Learning Algorithms for Crowdsourcing Research. Proceedings of the AAAI Conference on Human Computation and Crowdsourcing, 3(1), 44-45. https://doi.org/10.1609/hcomp.v3i1.13256
Section
Works in Progress