RMIT University

A time and opinion quality-weighted model for aggregating online reviews

conference contribution
posted on 2024-10-31, 19:41 authored by Yassien Shaalan, Xiuzhen Zhang
Online reviews play an important role in helping online shoppers make buying decisions. However, reading all or even most of the reviews is an overwhelming and time-consuming task. Many online shopping websites provide aggregate scores for products to help consumers decide. Averaging the star ratings of all online reviews is widely used but is hardly effective for ranking products. Recent research has proposed weighted aggregation models, with weighting heuristics that include opinion polarities mined from review text as well as the distribution of star ratings. But the quality of the opinions in reviews is largely ignored by existing aggregation models. In this paper we propose a novel review weighting model that combines information on the posting time and opinion quality of reviews. In particular, we use helpfulness votes cast by online review communities to measure opinion quality. Our model generates aggregate scores to rank products. Extensive experiments on an Amazon dataset showed that our model ranked products in strong correspondence with customer purchase rank and outperformed several other approaches.
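The idea described in the abstract, weighting each review by its posting time and its opinion quality before averaging, can be sketched in a few lines. This is a minimal illustration only: the exponential time decay, the Laplace-smoothed helpfulness ratio, and all parameter names here are assumptions, not the weighting functions actually defined in the paper.

```python
from dataclasses import dataclass
from math import exp

@dataclass
class Review:
    rating: float        # star rating, e.g. 1.0-5.0
    age_days: float      # time elapsed since the review was posted
    helpful_votes: int   # votes marking the review as helpful
    total_votes: int     # all helpfulness votes cast on the review

def review_weight(r: Review, decay: float = 0.01) -> float:
    """Combine a time factor with an opinion-quality factor.
    Both functional forms are illustrative choices, not the paper's."""
    time_w = exp(-decay * r.age_days)                    # newer reviews count more
    quality_w = (r.helpful_votes + 1) / (r.total_votes + 2)  # smoothed helpfulness ratio
    return time_w * quality_w

def aggregate_score(reviews: list[Review]) -> float:
    """Weighted average of star ratings; products are ranked by this score."""
    total_w = sum(review_weight(r) for r in reviews)
    if total_w == 0:
        return 0.0
    return sum(review_weight(r) * r.rating for r in reviews) / total_w
```

Under this scheme a recent, widely endorsed review contributes far more to a product's score than an old review that the community voted unhelpful, which is the intuition the paper's model formalises.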

History

Related Materials

  1. DOI - Is published in 10.1007/978-3-319-46922-5_21
  2. ISBN - Is published in 9783319469218 (urn:isbn:9783319469218)

Start page

269

End page

282

Total pages

14

Outlet

Proceedings of the 27th Australasian Database Conference (ADC 2016)

Editors

Muhammad Aamir Cheema, Wenjie Zhang, Lijun Chang

Name of conference

ADC 2016: Databases Theory and Applications

Publisher

Springer

Place published

Switzerland

Start date

2016-09-28

End date

2016-09-29

Language

English

Copyright

© Springer International Publishing AG 2016

Former Identifier

2006067088

Esploro creation date

2020-06-22

Fedora creation date

2016-10-18
