RMIT University

Can the crowd judge truthfulness? A longitudinal study on recent misinformation about COVID-19

journal contribution
posted on 2024-11-02, 17:50 authored by Kevin Roitero, Michael Soprano, Beatrice Portelli, Massimiliano De Luise, Damiano SpinaDamiano Spina, Vincenzo Della Mea, Giuseppe Serra, Stefano Mizzaro, Gianluca Demartini
Recently, the misinformation problem has been addressed with a crowdsourcing-based approach: to assess the truthfulness of a statement, instead of relying on a few experts, a crowd of non-experts is employed. We study whether crowdsourcing is an effective and reliable method to assess truthfulness during a pandemic, targeting statements related to COVID-19 and thus addressing (mis)information that is both tied to a sensitive, personal issue and very recent relative to when the judgment is made. In our experiments, crowd workers are asked to assess the truthfulness of statements and to provide evidence for their assessments. Besides showing that the crowd is able to accurately judge the truthfulness of the statements, we report results on worker behavior, agreement among workers, and the effects of aggregation functions, scale transformations, and worker background and bias. We perform a longitudinal study by re-launching the task multiple times with both novice and experienced workers, deriving insights on how behavior and quality change over time. Our results show that workers are able to detect and objectively categorize online (mis)information related to COVID-19; that both crowdsourced and expert judgments can be transformed and aggregated to improve quality; and that worker background and other signals (e.g., source of information, behavior) impact data quality. The longitudinal study demonstrates that the time span has a major effect on judgment quality for both novice and experienced workers. Finally, we provide an extensive failure analysis of the statements misjudged by the crowd workers.
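
The abstract's notions of judgment aggregation and scale transformation can be illustrated with a minimal sketch. The snippet below is not the authors' pipeline: the 0-100 judgment scale, the three-level target scale, the sample data, and the helper names are assumptions made purely for illustration. It aggregates per-statement worker scores with the mean and the median, then maps the aggregate onto a coarser scale.

    # Minimal illustrative sketch, not the authors' method: aggregate crowd
    # truthfulness judgments and map them onto a coarser scale. The 0-100 scale,
    # the three-level target scale, and the sample data are hypothetical.
    from statistics import mean, median

    # Hypothetical raw judgments: statement id -> worker scores on a 0-100 scale.
    judgments = {
        "stmt_01": [85, 90, 70, 95, 80],
        "stmt_02": [10, 25, 5, 40, 15],
    }

    def to_coarse_scale(score: float) -> str:
        """Map a 0-100 truthfulness score onto a coarse three-level scale."""
        if score < 34:
            return "false"
        if score < 67:
            return "mixed"
        return "true"

    for stmt_id, scores in judgments.items():
        agg_mean = mean(scores)        # one possible aggregation function
        agg_median = median(scores)    # another, more robust to outliers
        print(stmt_id, round(agg_mean, 1), agg_median, to_coarse_scale(agg_median))

Aggregating with the median and then collapsing the scale is only one of many reasonable choices; the paper compares the effects of several aggregation functions and scale transformations empirically.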

Funding

Fair and Transparent Information Access in Spoken Conversational Assistants

Australian Research Council

Building crowd sourced data curation processes

Australian Research Council

History

Journal: Personal and Ubiquitous Computing
Volume: 27
Issue: 1
Start page: 59
End page: 89
Total pages: 31
Publisher: Springer
Place published: United Kingdom
Language: English
Copyright: © The Author(s) 2021. Open Access. This article is licensed under a Creative Commons Attribution 4.0 International License.
Former Identifier: 2006110183
Esploro creation date: 2023-03-01
