Realtime user ratings as a strategy for combatting misinformation: an experimental study

Project info

Work package
  • Synthesis
Sustainability threat
  • Feedback Cycles
Challenge
  • Dealing with diversity

Study info

Description of Study
Because fact-checking takes time, verdicts are usually reached after a message has gone viral and interventions can have only limited effect. A new approach recently proposed in scholarship and piloted on online platforms is to harness the wisdom of the crowd by enabling recipients of an online message to attach veracity assessments to it. The intention is to allow poor initial crowd reception to temper belief in and further spread of misinformation. We study this approach by letting 4000 subjects in 80 experimental bipartisan communities sequentially rate the veracity of informational messages. We find that in well-mixed communities, the public display of earlier veracity ratings indeed enhances the correct classification of true and false messages by subsequent users. However, crowd intelligence backfires when false information is sequentially rated in ideologically segregated communities. This happens because early raters' ideological bias, when aligned with a message, pulls later raters' assessments away from the truth. These results suggest that network segregation poses an important problem for community misinformation detection systems that must be accounted for in the design of such systems.
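The sequential-rating dynamic described above can be illustrated with a minimal Python sketch. This is a toy model written for this documentation, not the study's actual design or model; parameters such as p_private_correct and p_follow_majority are hypothetical. Later raters who can see earlier ratings sometimes follow the running majority, so a sequence amplifies whatever the early raters happened to do.

import random

def simulate_sequence(n_raters=50, p_private_correct=0.7, p_follow_majority=0.5,
                      show_prior_ratings=True, seed=0):
    """Return the fraction of correct ratings in one simulated rating sequence."""
    rng = random.Random(seed)
    ratings = []  # True = correct classification of the message
    for _ in range(n_raters):
        private_judgement = rng.random() < p_private_correct
        if show_prior_ratings and ratings and rng.random() < p_follow_majority:
            n_correct = sum(ratings)
            if n_correct * 2 == len(ratings):
                # Tie among earlier raters: fall back on own judgement.
                ratings.append(private_judgement)
            else:
                # Follow the current majority of displayed ratings.
                ratings.append(n_correct * 2 > len(ratings))
        else:
            ratings.append(private_judgement)
    return sum(ratings) / len(ratings)

if __name__ == "__main__":
    independent = [simulate_sequence(show_prior_ratings=False, seed=s) for s in range(500)]
    social = [simulate_sequence(show_prior_ratings=True, seed=s) for s in range(500)]
    print("mean fraction correct, independent:     ", round(sum(independent) / len(independent), 3))
    print("mean fraction correct, with social info:", round(sum(social) / len(social), 3))

In this toy model, a private accuracy above 0.5 means the social condition ends up more accurate than the independent one, while a private accuracy below 0.5 (e.g. an ideology-congruent false message in a segregated community) lets early errors cascade and reverses the comparison, mirroring the backfire pattern described above.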
Study research question
How do real-time veracity ratings hinder or amplify individuals' ability to correctly distinguish true from false information in different network types?
Collection provenance
  • Collected during project
Collection methods
  • Experiment
Personal data
No
External Source
Source description
File formats
  • csv
Data types
  • Structured
Languages
  • English
Coverage start
01/07/2021
Coverage end
31/12/2021
Spatial coverage
United States
Collection period start
01/07/2021
Collection period end
31/12/2021

Variables

Unit
Individuals
Unit description
self-identified US conservatives and liberals
Sample size
2000
Sampling method
Recruited on Amazon Mechanical Turk

Unit
Individuals
Unit description
self-identified US conservatives and liberals
Sample size
2000
Sampling method
Recruited on Prolific
Hypothesis
H1: When it is not too difficult to classify a message correctly (d < 0.5), individuals in integrated groups (with information about previous rating choices) classify true and false messages correctly more often than individuals in independent groups (without information about previous rating choices).
Theory
Wisdom of crowds under social influence

Hypothesis
H2: When it is not too difficult for ideologically aligned individuals to classify a message correctly (d_align < 0.5), individuals in segregated groups (with information about previous rating choices) classify true messages correctly more often than individuals in independent groups (without information about previous rating choices).
Theory
Wisdom of crowds under social influence

Hypothesis
H3: When it is difficult for ideologically aligned individuals to classify a message correctly (d_align > 0.5), individuals in segregated groups (with information about previous rating choices) classify false messages correctly less often than individuals in independent groups (without information about previous rating choices).
Theory
Wisdom of crowds under social influence
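Read schematically, the three hypotheses above compare the probability of a correct classification across conditions. The notation below is introduced here for clarity only; the thresholds d and d_align are the difficulty parameters from the hypotheses, everything else is shorthand, not taken from the study materials.

H1:  $d < 0.5 \Rightarrow \Pr(\text{correct} \mid \text{integrated, social info}) > \Pr(\text{correct} \mid \text{independent})$   (true and false messages)
H2:  $d_{\text{align}} < 0.5 \Rightarrow \Pr(\text{correct} \mid \text{segregated, social info}) > \Pr(\text{correct} \mid \text{independent})$   (true messages)
H3:  $d_{\text{align}} > 0.5 \Rightarrow \Pr(\text{correct} \mid \text{segregated, social info}) < \Pr(\text{correct} \mid \text{independent})$   (false messages)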
Variable type
Dependent variable
Variable name
fraction_correct
Variable description
fraction of correct ratings in a sequence

Variable type
Independent variable
Variable name
social_influence
Variable description
whether the sequence allows for social influence (yes / no)

Variable type
Independent variable
Variable name
network_type
Variable description
whether the rating sequence is ideologically integrated or segregated
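A short sketch of how these variables could be derived from the raw csv data. The column names (sequence_id, rating_correct, social_influence, network_type) and the filename ratings.csv are assumptions made for illustration, not the actual schema of the replication package.

import csv
from collections import defaultdict
from statistics import mean

def fraction_correct_per_sequence(path):
    """Aggregate raw ratings into the fraction_correct outcome per rating sequence."""
    correct = defaultdict(list)
    condition = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            seq = row["sequence_id"]
            correct[seq].append(int(row["rating_correct"]))
            condition[seq] = (row["social_influence"], row["network_type"])
    return {seq: (mean(vals), condition[seq]) for seq, vals in correct.items()}

def mean_by_condition(per_sequence):
    """Average fraction_correct within each (social_influence, network_type) cell."""
    cells = defaultdict(list)
    for frac, cond in per_sequence.values():
        cells[cond].append(frac)
    return {cond: mean(fracs) for cond, fracs in cells.items()}

if __name__ == "__main__":
    per_seq = fraction_correct_per_sequence("ratings.csv")
    for cond, avg in sorted(mean_by_condition(per_seq).items()):
        print(cond, round(avg, 3))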
Discipline-specific operationalizations
Conflict of interest
Nothing to declare

Data packages

Realtime User Ratings as a Strategy for Combatting Misinformation: An Experimental Study

Data package DOI
10.17605/OSF.IO/P5BYQ
Description
Replication Package
Accessibility
Open Access
Repository
OSF
User license
CC 2.0
Retention period
10 years

Publications

Realtime user ratings as a strategy for combatting misinformation: an experimental study

Stein, J., Frey, V., & van de Rijt, A. (2023). Realtime user ratings as a strategy for combatting misinformation: an experimental study. Scientific Reports, 13(1), 1626.

Documents

Filename
Description
Date

Ethics

Ethical assessment
Yes
Ethical committee
Ethical committee of the European University Institute, Florence