RMIT University
User performance versus precision measures for simple search tasks

Conference contribution
posted on 2024-11-23, 01:53, authored by Andrew Turpin and Falk Scholer
Several recent studies have demonstrated that the types of improvements in information retrieval system effectiveness reported in forums such as SIGIR and TREC do not translate into a benefit for users. Two of the studies used an instance recall task, and a third used a question-answering task, so perhaps it is unsurprising that the precision-based measures of IR system effectiveness on one-shot query evaluation do not correlate with user performance on these tasks. In this study, we evaluate two different information retrieval tasks on TREC Web-track data: a precision-based user task, measured by the length of time that users need to find a single document that is relevant to a TREC topic; and a simple recall-based task, represented by the total number of relevant documents that users can identify within five minutes. Users employ search engines whose mean average precision (MAP) is controlled to between 55% and 95%. Our results show that there is no significant relationship between system effectiveness measured by MAP and the precision-based task. A significant but weak relationship is present for the precision at one document returned (P@1) metric. A weak relationship is present between MAP and the simple recall-based task.
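For background, the effectiveness measures named in the abstract follow their standard TREC definitions (summarised here for convenience; these formulas are not quoted from the paper itself):

$$
\mathrm{AP}(q) = \frac{1}{R_q} \sum_{k=1}^{n} P(k)\,\mathrm{rel}(k), \qquad
\mathrm{MAP} = \frac{1}{|Q|} \sum_{q \in Q} \mathrm{AP}(q),
$$

where $R_q$ is the number of relevant documents for topic $q$, $P(k)$ is the precision of the top $k$ ranked documents, and $\mathrm{rel}(k) = 1$ if the document at rank $k$ is relevant and $0$ otherwise. Precision at one document returned (P@1) is simply $\mathrm{rel}(1)$, i.e. whether the first result returned is relevant.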

History

Start page

11

End page

18

Total pages

8

Outlet

Proceedings of the 29th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval

Editors

S. Dumais, E.N. Efthimiadis, D. Hawking, K. Järvelin

Name of conference

Conference on Research and Development in Information Retrieval

Publisher

Association for Computing Machinery (ACM)

Place published

USA

Start date

2006-08-06

End date

2006-08-11

Language

English

Copyright

© 2006 ACM

Former Identifier

2006001961

Esploro creation date

2020-06-22

Fedora creation date

2009-10-08

Open access

Yes
