
Can The Crowd Identify Misinformation Objectively?: The Effects of Judgment Scale and Assessor's Background

conference contribution
posted on 2024-11-03, 12:54 authored by Kevin Roitero, Michael Soprano, Shaoyang Fan, Damiano Spina, Stefano Mizzaro, Gianluca Demartini
Truthfulness judgments are a fundamental step in the process of fighting misinformation, as they are crucial to train and evaluate classifiers that automatically distinguish true and false statements. Usually, such judgments are made by experts, such as journalists for political statements or medical doctors for medical statements. In this paper, we follow a different approach and rely on (non-expert) crowd workers. This naturally leads to the following research question: Can crowdsourcing be reliably used to assess the truthfulness of information and to create large-scale labeled collections for information credibility systems? To address this question, we present the results of an extensive crowdsourcing study: we collect thousands of truthfulness assessments over two datasets, and we compare expert judgments with crowd judgments expressed on scales of various granularities. We also measure the political bias and the cognitive background of the workers, and quantify their effect on the reliability of the data the crowd provides.
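
To make the comparison described in the abstract concrete, below is a minimal Python sketch (not the authors' code) of the general workflow: per-statement crowd judgments on a coarse truthfulness scale are aggregated and checked against expert labels. The statement names, scale range, sample values, and the choice of mean aggregation with a midpoint-agreement check are all illustrative assumptions; the paper's actual scales, datasets, and analyses are described in the full text.

```python
# Hedged sketch: aggregate crowd truthfulness judgments and compare them
# with expert labels. All data and design choices here are hypothetical.
from statistics import mean

# Hypothetical judgments on a six-level scale (0 = false ... 5 = true),
# several crowd workers per statement, plus one expert label each.
crowd_judgments = {
    "statement-1": [4, 5, 3, 4, 5],
    "statement-2": [1, 0, 2, 1, 1],
}
expert_labels = {"statement-1": 5, "statement-2": 0}

# Aggregate each statement's crowd scores; the mean is one common choice.
aggregated = {s: mean(js) for s, js in crowd_judgments.items()}

# Coarse agreement check: do the aggregated crowd score and the expert
# label fall on the same side of the scale midpoint?
MIDPOINT = 2.5
for s, crowd_score in aggregated.items():
    agree = (crowd_score > MIDPOINT) == (expert_labels[s] > MIDPOINT)
    print(f"{s}: crowd={crowd_score:.1f} expert={expert_labels[s]} agree={agree}")
```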


Related Materials

  1. DOI - Is published in 10.1145/3397271.3401112

Start page

439

End page

448

Total pages

10

Outlet

Proceedings of the 43rd Annual International ACM SIGIR Conference on Research and Development in Information Retrieval (SIGIR 2020)

Name of conference

SIGIR 2020

Publisher

Association for Computing Machinery

Place published

New York, United States

Start date

2020-07-25

End date

2020-07-30

Language

English

Copyright

© 2020 Copyright held by the owner/author(s). Publication rights licensed to ACM.

Former Identifier

2006100886

Esploro creation date

2020-09-08
