RMIT University

Identifying re-finding difficulty from user query logs

Conference contribution
posted on 2024-10-31, 18:38, authored by Seyedeh Sadeghi, Roi Blanco Gonzalez, Peter Mika, Mark Sanderson, Falk Scholer, David Vallet
This paper presents a first study of how consistently human assessors are able to identify, from query logs, when searchers are facing difficulties re-finding documents. Using 12 assessors, we investigate the effect of two variables on assessor agreement: the level of detail in the assessment guidelines, and assessor experience. The results indicate statistically significantly better agreement when detailed guidelines are used. An upper agreement of 78.9% was achieved, which is comparable to the levels of agreement reported in other information retrieval contexts. The effects of two contextual factors, representative of system performance and user effort, were also studied. Significant differences between agreement levels were found for both factors, suggesting that contextual factors may play an important role in obtaining higher agreement levels. The findings contribute to a better understanding of how to generate ground truth data in re-finding and other labeling contexts, and have further implications for building automatic re-finding difficulty prediction models.
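The agreement figures above (e.g. the 78.9% upper bound) are percentages of assessor judgments that coincide. As a minimal sketch only — the paper's exact measure and labels are not specified here, so the function name, label values, and averaging scheme below are illustrative assumptions — pairwise percent agreement across a panel of assessors can be computed like this:

```python
from itertools import combinations

def pairwise_agreement(labels_by_item):
    """Fraction of assessor pairs assigning the same label, pooled over items.

    labels_by_item: list of lists; each inner list holds the labels that
    the assessors gave to one query-log item (e.g. 'difficult' / 'easy').
    """
    total_pairs = 0
    agreeing_pairs = 0
    for labels in labels_by_item:
        for a, b in combinations(labels, 2):
            total_pairs += 1
            if a == b:
                agreeing_pairs += 1
    return agreeing_pairs / total_pairs

# Hypothetical data: three items, each judged by four assessors.
items = [
    ["difficult", "difficult", "difficult", "easy"],
    ["easy", "easy", "easy", "easy"],
    ["difficult", "easy", "difficult", "easy"],
]
print(pairwise_agreement(items))  # 11 of 18 pairs agree
```

Raw percent agreement ignores chance agreement; chance-corrected statistics such as Cohen's or Fleiss' kappa are the usual complement when comparing agreement levels across conditions.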

History

Related Materials

  1. DOI - Is published in 10.1145/2682862.2682867
  2. ISBN - Is published in 9781450330008

Start page

105

End page

108

Total pages

4

Outlet

Proceedings of the 19th Australasian Document Computing Symposium (ADCS 2014)

Editors

J. Shane Culpepper, Laurence Park, and Guido Zuccon

Name of conference

ADCS 2014

Publisher

Association for Computing Machinery

Place published

New York, United States

Start date

2014-11-27

End date

2014-11-28

Language

English

Copyright

Copyright is held by the owner/author(s). Publication rights licensed to ACM.

Former Identifier

2006053599

Esploro creation date

2020-06-22

Fedora creation date

2015-06-23
