
Exploiting Rich Textual User-Product Context for Improving Sentiment Analysis

Authors:
Lyu, Chenyang
Yang, Linyi
Zhang, Yue
Graham, Yvette
Foster, Jennifer
Publication Year: 2022

Abstract

User and product information associated with a review is useful for sentiment polarity prediction. Typical approaches incorporating such information focus on modeling users and products as implicitly learned representation vectors. Most do not exploit the potential of historical reviews, while those that do either require unnecessary modifications to the model architecture or do not make full use of user/product associations. The contribution of this work is twofold: i) a method to explicitly employ historical reviews belonging to the same user/product to initialize representations, and ii) efficient incorporation of textual associations between users and products via a user-product cross-context module. Experiments on the IMDb, Yelp-2013 and Yelp-2014 benchmarks show that our approach substantially outperforms the previous state-of-the-art. While our main experiments use BERT-base as the encoder, we additionally show that our approach performs well with SpanBERT and Longformer. Furthermore, experiments in which the reviews of each user/product in the training data are downsampled demonstrate the effectiveness of our approach under a low-resource setting.
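To make the first contribution concrete, below is a minimal, hypothetical sketch (not the authors' released code) of one plausible reading of "explicitly employ historical reviews ... to initialize representations": encode a user's or product's historical reviews with BERT-base and mean-pool the [CLS] embeddings into an initial representation vector. The function name `init_representation` and the pooling choice are assumptions for illustration only.

```python
import torch
from transformers import AutoModel, AutoTokenizer

MODEL_NAME = "bert-base-uncased"  # assumption; the paper reports BERT-base as the encoder

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
encoder = AutoModel.from_pretrained(MODEL_NAME)
encoder.eval()


@torch.no_grad()
def init_representation(historical_reviews, max_length=256):
    """Mean-pool the [CLS] embeddings of a user's/product's historical
    reviews to form an initial representation vector (illustrative only)."""
    batch = tokenizer(
        historical_reviews,
        padding=True,
        truncation=True,
        max_length=max_length,
        return_tensors="pt",
    )
    cls_embeddings = encoder(**batch).last_hidden_state[:, 0]  # (n_reviews, hidden)
    return cls_embeddings.mean(dim=0)                          # (hidden,)


# Example: build an initial vector for one user from two past reviews.
user_vec = init_representation([
    "Loved the pacing and the performances.",
    "Too long, but the soundtrack was great.",
])
print(user_vec.shape)  # torch.Size([768])
```

Such a vector could then serve as the starting point for the user/product representations that the cross-context module refines, rather than learning them from random initialization.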

Details

Database: arXiv
Publication Type: Report
Accession number: edsarx.2212.08888
Document Type: Working Paper