A Novice-Reviewer Experiment to Address Scarcity of Qualified Reviewers in Large Conferences
Keywords: AI for Conference Organization and Delivery (AICOD), Analysis of Human Computation (Limitations, Optim
Abstract
Conference peer review constitutes a human-computation process whose importance cannot be overstated: not only does it identify the best submissions for acceptance, but, ultimately, it impacts the future of the whole research area by promoting some ideas and restraining others. A surge in the number of submissions received by leading AI conferences has challenged the sustainability of the review process by increasing the burden on the pool of qualified reviewers, which is growing at a much slower rate. In this work, we consider the problem of reviewer recruiting with a focus on the scarcity of qualified reviewers in large conferences. Specifically, we design a procedure for (i) recruiting reviewers from a population not typically covered by major conferences and (ii) guiding them through the reviewing pipeline. In conjunction with ICML 2020, a large, top-tier machine learning conference, we recruit a small set of reviewers through our procedure and compare their performance with the general population of ICML reviewers. Our experiment reveals that the combination of the recruiting and guiding mechanisms allows for a principled enhancement of the reviewer pool and results in reviews of superior quality, as evaluated by senior members of the program committee (meta-reviewers), compared to reviews from the conventional pool.
How to Cite
Stelmakh, I., Shah, N. B., Singh, A., & Daumé III, H. (2021). A Novice-Reviewer Experiment to Address Scarcity of Qualified Reviewers in Large Conferences. Proceedings of the AAAI Conference on Artificial Intelligence, 35(6), 4785-4793. Retrieved from https://ojs.aaai.org/index.php/AAAI/article/view/16610
AAAI Technical Track Focus Area on AI for Conference Organization and Delivery