RMIT University

Ranking Documents by Answer-Passage Quality

conference contribution
posted on 2024-11-03, 11:51 authored by Evi Yulianti, Ruey-Cheng Chen, Falk Scholer, Bruce Croft, Mark Sanderson
Evidence derived from passages that closely represent likely answers to a posed query can be useful input to the ranking process. Based on a novel use of Community Question Answering data, we present an approach for the creation of such passages. A general framework for extracting answer passages and estimating their quality is proposed, and this evidence is integrated into ranking models. Our experiments on two web collections show that such quality estimates from answer passages provide a strong indication of document relevance and compare favorably to previous passage-based methods. Combining such evidence can significantly improve over a set of state-of-the-art ranking models, including Quality-Biased Ranking, External Expansion, and a combination of both. A final ranking model that incorporates all quality estimates achieves further improvements on both collections.
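The abstract describes integrating passage-quality evidence into a document ranking model. A minimal sketch of that general idea is a linear interpolation of a base retrieval score with the quality of a document's best candidate answer passage. The function names, weights, and scoring scheme below are illustrative assumptions, not the paper's actual model.

```python
# Hypothetical sketch: combine a base document retrieval score with an
# answer-passage quality estimate. The interpolation weight `alpha` and
# the max-over-passages aggregation are illustrative choices only.

def passage_quality(passages):
    """Quality of the best candidate answer passage (max over estimates)."""
    return max(passages) if passages else 0.0

def combined_score(base_score, passages, alpha=0.8):
    """Linearly interpolate the document score with passage quality."""
    return alpha * base_score + (1.0 - alpha) * passage_quality(passages)

# Two documents with equal base scores but different answer-passage quality:
docs = {
    "d1": (10.0, [0.2, 0.9]),  # contains a strong answer passage
    "d2": (10.0, [0.1, 0.3]),  # only weak answer passages
}
ranking = sorted(docs, key=lambda d: combined_score(*docs[d]), reverse=True)
```

Under this sketch, `d1` ranks above `d2` purely on passage-quality evidence, mirroring the abstract's claim that such estimates provide a strong indication of document relevance.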

Funding

Effective summaries for search results (Australian Research Council)

Continuous and summarised search over evolving heterogeneous data (Australian Research Council)

History

Start page: 335
End page: 344
Total pages: 10
Outlet: Proceedings of the 41st International ACM SIGIR Conference on Research and Development in Information Retrieval (SIGIR 2018)
Name of conference: SIGIR 2018
Publisher: Association for Computing Machinery
Place published: New York, United States
Start date: 2018-07-08
End date: 2018-07-12
Language: English
Copyright: © 2018 Association for Computing Machinery.
Former Identifier: 2006088437
Esploro creation date: 2020-06-22
Fedora creation date: 2019-02-21
