A Crowdsourcing Method for Obtaining Rephrased Questions

Authors

  • Nobuyuki Shimizu, Yahoo Japan Corporation
  • Atsuyuki Morishima, University of Tsukuba
  • Ryota Hayashi, University of Tsukuba

DOI:

https://doi.org/10.1609/hcomp.v3i1.13251

Keywords:

Crowdsourcing, human computation, paraphrase, question paraphrase

Abstract

We propose a method for obtaining and ranking paraphrased questions from crowds, to be used as part of the instructions in microtask-based crowdsourcing. With our method, we are able to obtain questions that differ in expression yet have the same semantics with respect to the crowdsourcing task. This is done by generating tasks that give hints and elicit instructions from workers. We conducted experiments with data from a real set of gold-standard questions submitted to a commercial crowdsourcing platform and compared the results with those from a direct-rewrite method.

Published

2015-09-23

How to Cite

Shimizu, N., Morishima, A., & Hayashi, R. (2015). A Crowdsourcing Method for Obtaining Rephrased Questions. Proceedings of the AAAI Conference on Human Computation and Crowdsourcing, 3(1), 32-33. https://doi.org/10.1609/hcomp.v3i1.13251