Design Guidelines for Crowdsourcing Emotion Annotations [Archived]

November 17, 2015
Duration: One semester
Goals: Develop guidelines for designing a crowdsourcing task for textual emotion annotation.
Responsible TA: Valentina Sintsova
Student Name: Not assigned yet
Keywords: Emotion Annotation, Crowdsourcing, Design Guidelines, Design Recommendations, Task Design
Abstract:

Annotation of emotions in collected text documents is an essential step in developing a reliable emotion classification system. The annotated data can serve as ground truth for training and/or testing emotion recognition models. Current crowdsourcing approaches involve annotating text documents for emotions and emotion indicators. Yet subjectivity in how the concept of emotion is understood, combined with a lack of clear instructions, can lead to highly inconsistent annotations. There is therefore a need to develop appropriate design guidelines to ensure the collection of high-quality annotations.

The responsible student will:

  • Survey existing approaches to crowdsourcing sentiment and emotion annotation of text, as well as the methods for their qualitative and quantitative evaluation
  • Analyze data collected from previous crowdsourcing experiments on emotion annotation, for example by measuring inter-annotator agreement (see the sketch after this list)
  • Based on that analysis, derive design guidelines for crowdsourcing tasks on textual emotion annotation
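
As an illustration of the quantitative evaluation and statistical analysis mentioned above, the sketch below computes Fleiss' kappa, a standard measure of inter-annotator agreement for multiple annotators, over hypothetical crowdworker labels. The function, the example counts, and the emotion categories are assumptions made purely for illustration; they are not part of any existing project data or code.

```python
import numpy as np

def fleiss_kappa(counts):
    """Fleiss' kappa for a matrix of shape (n_items, n_categories),
    where counts[i, j] is the number of annotators who assigned item i
    to emotion category j. Assumes the same number of annotators per item."""
    counts = np.asarray(counts, dtype=float)
    n_items, _ = counts.shape
    n_raters = counts.sum(axis=1)[0]  # annotators per item
    # Per-item agreement: fraction of annotator pairs that agree on the item.
    p_i = (np.sum(counts ** 2, axis=1) - n_raters) / (n_raters * (n_raters - 1))
    p_bar = p_i.mean()
    # Chance agreement derived from the overall category distribution.
    p_j = counts.sum(axis=0) / (n_items * n_raters)
    p_e = np.sum(p_j ** 2)
    return (p_bar - p_e) / (1 - p_e)

# Hypothetical example: 4 text snippets, 5 annotators each,
# labels over three emotion categories (e.g., joy / sadness / anger).
counts = [
    [5, 0, 0],
    [3, 2, 0],
    [1, 1, 3],
    [0, 5, 0],
]
print(round(fleiss_kappa(counts), 3))  # ~0.472 for this toy data
```

A low kappa value (e.g., below 0.4) would signal the kind of annotation inconsistency that the proposed design guidelines aim to reduce.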
Related Skills: Strong analytical skills; Statistical analysis; Interest in Crowdsourcing, Qualitative Research, and Human-Computer Interaction Design
Suitable for: Bachelor or Master Student