
Hierarchical Bi-Directional Self-Attention Networks for Paper Review Rating Recommendation

Authors :
Hao Peng
Congying Xia
Jianxin Li
Philip S. Yu
Zhongfen Deng
Lifang He
Source :
COLING
Publication Year :
2020
Publisher :
International Committee on Computational Linguistics, 2020.

Abstract

Review rating prediction for text reviews is a rapidly growing area of natural language processing with a wide range of applications. However, most existing methods either use hand-crafted features or learn features with deep learning from a flat text corpus, ignoring the hierarchies within the data. In this paper, we propose a Hierarchical bi-directional self-attention Network framework (HabNet) for paper review rating prediction and recommendation, which can serve as an effective decision-making tool for the academic paper review process. Specifically, we leverage the hierarchical structure of paper reviews with three levels of encoders: a sentence encoder (level one), an intra-review encoder (level two), and an inter-review encoder (level three). Each encoder first derives a contextual representation of its level and then generates a higher-level representation. After the learning process, we are able to identify useful predictors for the final acceptance decision, as well as to discover inconsistencies between numerical review ratings and the text sentiment conveyed by reviewers. Furthermore, we introduce two new metrics for evaluating models under data imbalance. Extensive experiments on a publicly available dataset (PeerRead) and our own collected dataset (OpenReview) demonstrate the superiority of the proposed approach over state-of-the-art methods.

Accepted by COLING 2020.
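The three-level hierarchy described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: it uses a single-head, parameter-free scaled dot-product self-attention with mean pooling in place of the paper's learned bi-directional self-attention encoders, and toy random embeddings in place of real word vectors. The shapes and level names are the only parts taken from the abstract.

```python
import numpy as np

def self_attention(X):
    # Simplified stand-in for one HabNet encoder level: scaled
    # dot-product self-attention (single head, no learned projections)
    # over a sequence X of shape (n, d), followed by mean pooling to
    # produce one summary vector of shape (d,).
    d = X.shape[-1]
    scores = X @ X.T / np.sqrt(d)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return (weights @ X).mean(axis=0)

rng = np.random.default_rng(0)
# Toy input: 2 reviews, each with 3 sentences of 4 word embeddings (dim 8).
reviews = rng.normal(size=(2, 3, 4, 8))

# Level one: sentence encoder -- word embeddings -> sentence vectors.
sent_vecs = np.stack([[self_attention(s) for s in review] for review in reviews])
# Level two: intra-review encoder -- sentence vectors -> review vectors.
review_vecs = np.stack([self_attention(r) for r in sent_vecs])
# Level three: inter-review encoder -- review vectors -> paper representation,
# from which an acceptance decision would be predicted.
paper_vec = self_attention(review_vecs)
print(paper_vec.shape)  # (8,)
```

Each level consumes the summaries produced by the level below it, which is what lets the model exploit the sentence/review/paper hierarchy rather than treating all reviews as one flat text.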

Details

Database :
OpenAIRE
Journal :
Proceedings of the 28th International Conference on Computational Linguistics
Accession number :
edsair.doi.dedup.....9d170289177862113884c76d33fe03ca