RMIT University
Crowdsourcing Truthfulness: The Impact of Judgment Scale and Assessor Bias

conference contribution
posted on 2024-11-03, 13:02 authored by David Barbera, Kevin Roitero, Gianluca Demartini, Stefano Mizzaro, Damiano Spina
News content can sometimes be misleading and influence users’ decision-making processes (e.g., voting decisions). Quantitatively assessing the truthfulness of content thus becomes key, but it is often challenging and so is usually done by experts. In this work we look at how experts and non-experts assess the truthfulness of content, focusing on the effect of the adopted judgment scale and of assessors’ own bias on the judgments they perform. Our results indicate a clear effect of assessors’ political background on their judgments: they tend to trust content that is aligned with their own beliefs, even if experts have marked it as false. Crowd assessors also seem to prefer coarse-grained scales, as they tend to use a few extreme values rather than the full breadth of fine-grained scales.

History

Related Materials

  1. DOI - Is published in 10.1007/978-3-030-45442-5_26
  2. ISBN - Is published in 9783030454418 (urn:isbn:9783030454418)

Start page

207

End page

214

Total pages

8

Outlet

Proceedings of the 42nd European Conference on IR Research (ECIR 2020)

Editors

Joemon M. Jose, Emine Yilmaz, João Magalhães, Pablo Castells, Nicola Ferro, Mário J. Silva, Flávio Martins

Name of conference

ECIR 2020, Part II (Lecture Notes in Computer Science 12036)

Publisher

Springer Nature

Place published

Switzerland

Start date

2020-04-14

End date

2020-04-17

Language

English

Copyright

© Springer Nature Switzerland AG 2020

Former Identifier

2006098493

Esploro creation date

2020-06-22

Fedora creation date

2020-05-12
