RMIT University

On crowdsourcing relevance magnitudes for information retrieval evaluation

journal contribution
posted on 2024-11-02, 05:29 authored by Eddy Maddalena, Stefano Mizzaro, Falk Scholer, Andrew Turpin
Magnitude estimation is a psychophysical scaling technique for the measurement of sensation, where observers assign numbers to stimuli in response to their perceived intensity. We investigate the use of magnitude estimation for judging the relevance of documents for information retrieval evaluation, carrying out a large-scale user study across 18 TREC topics and collecting over 50,000 magnitude estimation judgments using crowdsourcing. Our analysis shows that magnitude estimation judgments can be reliably collected using crowdsourcing, are competitive in terms of assessor cost, and are, on average, rank-aligned with ordinal judgments made by expert relevance assessors. We explore the application of magnitude estimation for IR evaluation, calibrating two gain-based effectiveness metrics, nDCG and ERR, directly from user-reported perceptions of relevance. A comparison of TREC system effectiveness rankings based on binary, ordinal, and magnitude estimation relevance shows substantial variation; in particular, the top systems ranked using magnitude estimation and ordinal judgments differ substantially. Analysis of the magnitude estimation scores shows that this effect is due in part to varying perceptions of relevance: different users have different perceptions of the impact of relative differences in document relevance. These results have direct implications for IR evaluation, suggesting that current assumptions about a single view of relevance being sufficient to represent a population of users are unlikely to hold.
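To illustrate the gain-based metric calibration the abstract mentions, here is a minimal sketch of nDCG computed from per-document gain values, where the gains could be taken directly from (normalized) magnitude estimation scores rather than fixed ordinal-grade weights. The example scores are hypothetical; the paper's actual calibration procedure may differ.

```python
import math

def dcg(gains):
    # Discounted cumulative gain: the gain at rank i (0-based)
    # is discounted by log2(rank + 2).
    return sum(g / math.log2(i + 2) for i, g in enumerate(gains))

def ndcg(gains):
    # Normalize by the DCG of the ideal (descending-gain) ordering,
    # so a perfect ranking scores 1.0.
    ideal = dcg(sorted(gains, reverse=True))
    return dcg(gains) / ideal if ideal > 0 else 0.0

# Hypothetical magnitude estimation scores for the documents a system
# returned, in ranked order (larger = perceived as more relevant).
me_gains = [80.0, 10.0, 45.0, 0.0]
print(ndcg(me_gains))
```

The design point is that nothing in nDCG requires gains to come from a small fixed grade set; any non-negative, ratio-scaled relevance score (such as a magnitude estimation judgment) can be substituted directly.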

Funding

Sub-collection retrieval: understanding and improving search engines

Australian Research Council



Related Materials

  1. DOI (is published in): 10.1145/3002172
  2. ISSN (is published in): 1046-8188

Journal

ACM Transactions on Information Systems

Volume

35

Number

19

Issue

3

Start page

1

End page

32

Total pages

32

Publisher

Association for Computing Machinery

Place published

United States

Language

English

Copyright

© 2017 ACM.

Former Identifier

2006077033

Esploro creation date

2020-06-22

Fedora creation date

2017-08-22
