Content Analysis by the Crowd: Assessing the Usability of Crowdsourcing for Coding Latent Constructs

Author(s)
Fabienne Lind, Maria Gruber, Hajo Boomgaarden
Abstract

Crowdsourcing platforms are commonly used for research in the humanities, social sciences, and informatics, including the use of crowdworkers to annotate textual material or visuals. Utilizing two empirical studies, this article systematically assesses the potential of crowdcoding for less manifest contents of news texts, here focusing on political actor evaluations. Specifically, Study 1 compares the reliability and validity of crowdcoded data to those of manual content analyses; Study 2 proceeds to investigate the effects of material presentation, different types of coding instructions, and answer option formats on data quality. We find that the performance of the crowd recommends crowdcoded data as a reliable and valid alternative to manually coded data, also for less manifest contents. While scale manipulations affected the results, minor modifications of the coding instructions or material presentation did not significantly influence data quality. In sum, crowdcoding appears to be a robust instrument for collecting quantitative content data.
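
For readers interested in the kind of reliability comparison the abstract describes, below is a minimal, hypothetical sketch of Krippendorff's alpha for nominal codings, the coefficient commonly reported in content analysis. This is not the authors' analysis code; the function, data, and labels are illustrative assumptions, and in practice a vetted library (e.g., the krippendorff package on PyPI) would be preferable.

import numpy as np

def krippendorff_alpha_nominal(codings):
    # codings: one list per unit, holding the category labels assigned by
    # the coders who rated that unit; units with fewer than two codings
    # contribute no coder pairs and are skipped.
    values = sorted({v for unit in codings for v in unit})
    idx = {v: i for i, v in enumerate(values)}
    o = np.zeros((len(values), len(values)))  # coincidence matrix
    for unit in codings:
        m = len(unit)
        if m < 2:
            continue
        for i in range(m):
            for j in range(m):
                if i != j:
                    o[idx[unit[i]], idx[unit[j]]] += 1.0 / (m - 1)
    n_v = o.sum(axis=1)                          # marginal frequency of each value
    n = n_v.sum()                                # total number of pairable values
    d_o = o.sum() - np.trace(o)                  # observed disagreement (off-diagonal mass)
    d_e = (n * n - (n_v ** 2).sum()) / (n - 1)   # disagreement expected by chance
    return 1.0 - d_o / d_e

# Hypothetical check: three sentences, each coded by one expert and two crowdworkers.
units = [["neg", "neg", "neg"], ["pos", "pos", "neu"], ["neu", "neu", "neu"]]
print(round(krippendorff_alpha_nominal(units), 3))  # ~0.692 on this toy data

Computing the coefficient once over expert codings alone and once over pooled expert-plus-crowd codings gives a simple, if rough, way to judge whether crowd annotations degrade reliability.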

Organization(s)
Institut für Publizistik- und Kommunikationswissenschaft
Journal
Communication Methods and Measures
Volume
11
Pages
191-209
Number of pages
19
DOI
https://doi.org/10.1080/19312458.2017.1317338
Publication date
May 2017
Peer-reviewed
Yes
ÖFOS 2012
508007 Communication science
Keywords
ASJC Scopus subject areas
Communication
Link to portal
https://ucris.univie.ac.at/portal/de/publications/content-analysis-by-the-crowd-assessing-the-usability-of-crowdsourcing-for-coding-latent-constructs(67e6cb94-a907-401b-99fe-e2d8d715241e).html