Online labor platforms, such as Amazon Mechanical Turk, provide an effective framework for eliciting responses to judgment tasks. Previous work has shown that workers respond best to financial incentives, especially to extra bonuses. However, most of the tested incentives describe the bonus conditions in formulas rather than in plain English. We believe that different incentives given in English (i.e., in qualitative framing) will result in differences in workers' performance, especially when task difficulty varies. In this paper, we report the preliminary results of a crowdsourcing experiment comparing workers' performance using only qualitative framings of financial incentives. Our results demonstrate a significant increase in workers' performance under a specific well-formulated qualitative framing inspired by the Peer Truth Serum. This positive effect is observed only when the difficulty of the task is high; when the task is easy, the choice of incentive makes no difference.
Human Computation, Emotion Annotation, Financial Incentives, Crowdsourcing, Framing Incentives, Bonus Formulations
- Sephora Madjiheurem, Valentina Sintsova, and Pearl Pu. Qualitative Framing of Financial Incentives: A Case of Emotion Annotation. In Work-In-Progress Track of the Human Computation Conference (HCOMP), 2016.
Short paper, [access]
- Sephora Madjiheurem, Valentina Sintsova, and Pearl Pu. Qualitative Framing of Financial Incentives: A Case of Emotion Annotation (Full Report). Unpublished report, 2016.
Full report, [access]
- Poster for HCOMP-2016, [pdf]