RMIT University
Ranking Interruptus: When Truncated Rankings Are Better and How to Measure That

conference contribution
posted on 2024-11-03, 14:54 authored by Enrique Amigó, Stefano Mizzaro, Damiano SpinaDamiano Spina
Most information retrieval effectiveness evaluation metrics assume that systems appending irrelevant documents at the bottom of the ranking are as effective as (or no worse than) systems that use a stopping criterion to 'truncate' the ranking at the right position and thus avoid retrieving those irrelevant documents at the end. It can be argued, however, that such truncated rankings are more useful to the end user. It is thus important to understand how to measure retrieval effectiveness in this scenario. In this paper we provide both theoretical and experimental contributions. We first define formal properties to analyze how effectiveness metrics behave when evaluating truncated rankings. Our theoretical analysis shows that de-facto standard metrics do not satisfy desirable properties for evaluating truncated rankings: only Observational Information Effectiveness (OIE) -- a metric based on Shannon's information theory -- satisfies them all. We then perform experiments to compare several metrics on nine TREC datasets. According to our experimental results, the most appropriate metrics for truncated rankings are OIE and a novel extension of Rank-Biased Precision that adds a user effort factor penalizing the retrieval of irrelevant documents.
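To illustrate the problem the abstract describes, the sketch below contrasts standard Rank-Biased Precision (Moffat & Zobel) with a hypothetical effort-penalized variant of the general kind the paper proposes. The penalty form here (a per-irrelevant-document cost discounted by the same persistence model) is an illustrative assumption, not the authors' actual definition, which is given in the paper itself.

```python
def rbp(relevance, p=0.8):
    """Standard RBP: (1 - p) * sum_i p^(i-1) * r_i over the retrieved ranking.

    Appending irrelevant documents (r_i = 0) never changes the score, so a
    truncated ranking and a padded one are scored identically.
    """
    return (1 - p) * sum(p ** i * r for i, r in enumerate(relevance))


def rbp_with_effort(relevance, p=0.8, cost=0.1):
    """Hypothetical effort-penalized variant (illustrative assumption only):
    each irrelevant retrieved document costs the user `cost` units of effort,
    discounted by the same persistence model, so truncating the ranking at
    the right point can now score strictly higher than padding it.
    """
    gain = sum(p ** i * r for i, r in enumerate(relevance))
    effort = cost * sum(p ** i * (1 - r) for i, r in enumerate(relevance))
    return (1 - p) * (gain - effort)


# Two rankings with the same relevant prefix: one truncated at rank 4,
# one padded with four irrelevant documents at the bottom.
truncated = [1, 1, 0, 1]
padded = [1, 1, 0, 1, 0, 0, 0, 0]

# Standard RBP cannot tell them apart; the effort-penalized variant
# prefers the truncated ranking.
print(rbp(truncated) == rbp(padded))                          # equal scores
print(rbp_with_effort(truncated) > rbp_with_effort(padded))   # truncated wins
```

This is exactly the failure mode the paper's formal properties are designed to detect: a metric that is indifferent to trailing irrelevant documents cannot reward a system for stopping at the right point.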

Funding

Fair and Transparent Information Access in Spoken Conversational Assistants

Australian Research Council

Related Materials

  1. DOI - Is published in 10.1145/3477495.3532051
  2. ISBN - Is published in 9781450387323 (urn:isbn:9781450387323)

Start page

588

End page

598

Total pages

11

Outlet

SIGIR '22: Proceedings of the 45th International ACM SIGIR Conference on Research and Development in Information Retrieval

Name of conference

SIGIR '22: The 45th International ACM SIGIR Conference on Research and Development in Information Retrieval

Publisher

ACM

Place published

United States

Start date

2022-07-11

End date

2022-07-15

Language

English

Copyright

© 2022

Former Identifier

2006116439

Esploro creation date

2022-11-26
