Sentiment analysis based on Chinese BERT and fused deep neural networks for sentence-level Chinese e-commerce product reviews.

Authors :
Fang, Hong
Jiang, Guangjie
Li, Desheng
Source :
Systems Science & Control Engineering; Dec 2022, Vol. 10, Issue 1, p802-810, 9p
Publication Year :
2022

Abstract

Driven by the rapid development of the Internet, more and more e-commerce product reviews are available on e-commerce platforms, which can help enterprises make business decisions. Currently, bidirectional encoder representations from transformers (BERT) applied in the embedding layer helps achieve promising results in English text sentiment analysis (SA). This paper proposes a novel model, Chinese BERT with fused deep neural networks (CBERT-FDNN), to extract richer and more accurate semantic and grammatical information from Chinese text. First, Chinese BERT with whole word masking (Chinese-BERT-wwm) is used in the embedding layer to generate dynamic sentence representation vectors. It is a Chinese pre-trained model based on the whole word masking (WWM) technique, which is more effective for contextual embedding of Chinese text. Second, multi-channel, multi-scale convolutional neural networks (CNN) and a bidirectional long short-term memory network (BiLSTM) are designed to capture further crucial features in the feature extraction layer. To obtain more comprehensive sentence attributes, these features are concatenated. Last, the model is evaluated on 100,000 sentence-level Chinese e-commerce product reviews for binary sentiment classification, achieving an accuracy of 94.37% and an F1 score of 94.34%. Compared with the baseline models, the experiments show that the proposed model has higher accuracy and better prediction performance. [ABSTRACT FROM AUTHOR]
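The abstract outlines the model's structure (Chinese-BERT-wwm embeddings, parallel multi-scale CNN and BiLSTM branches, concatenated features, binary classifier). Below is a minimal PyTorch sketch of that pipeline, assuming the publicly available hfl/chinese-bert-wwm checkpoint and hypothetical hyperparameters (kernel sizes, filter counts, hidden sizes) that the record does not state; the paper's actual multi-channel design and fusion details may differ.

```python
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer


class CBERTFDNN(nn.Module):
    """Sketch of the CBERT-FDNN pipeline described in the abstract:
    Chinese-BERT-wwm embeddings -> parallel multi-scale CNN and BiLSTM
    branches -> concatenated features -> binary sentiment classifier."""

    def __init__(self, bert_name="hfl/chinese-bert-wwm",  # assumed checkpoint
                 kernel_sizes=(2, 3, 4), num_filters=128, lstm_hidden=128):
        super().__init__()
        # Pre-trained Chinese BERT with whole word masking (embedding layer).
        self.bert = BertModel.from_pretrained(bert_name)
        hidden = self.bert.config.hidden_size  # 768 for BERT-base

        # Multi-scale CNN branch: one Conv1d per kernel size (n-gram width).
        self.convs = nn.ModuleList(
            [nn.Conv1d(hidden, num_filters, k) for k in kernel_sizes]
        )
        # BiLSTM branch capturing long-range sequential context.
        self.bilstm = nn.LSTM(hidden, lstm_hidden, batch_first=True,
                              bidirectional=True)
        # Concatenated CNN + BiLSTM features feed a binary classifier.
        feat_dim = num_filters * len(kernel_sizes) + 2 * lstm_hidden
        self.classifier = nn.Linear(feat_dim, 2)

    def forward(self, input_ids, attention_mask):
        # Dynamic token representations from Chinese-BERT-wwm.
        out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        seq = out.last_hidden_state                      # (B, T, H)

        # CNN branch: convolve over time, max-pool each feature map.
        x = seq.transpose(1, 2)                          # (B, H, T)
        cnn_feats = [torch.relu(conv(x)).max(dim=2).values
                     for conv in self.convs]             # each (B, F)

        # BiLSTM branch: final forward and backward hidden states.
        _, (h_n, _) = self.bilstm(seq)
        lstm_feat = torch.cat([h_n[0], h_n[1]], dim=1)   # (B, 2*lstm_hidden)

        # Fuse both branches and classify (positive vs. negative review).
        fused = torch.cat(cnn_feats + [lstm_feat], dim=1)
        return self.classifier(fused)


if __name__ == "__main__":
    # Hypothetical usage on two toy reviews (positive / negative).
    tokenizer = BertTokenizer.from_pretrained("hfl/chinese-bert-wwm")
    batch = tokenizer(["质量很好，物流也快。", "包装破损，非常失望。"],
                      padding=True, return_tensors="pt")
    logits = CBERTFDNN()(batch["input_ids"], batch["attention_mask"])
    print(logits.argmax(dim=1))  # 0/1 predictions (untrained weights)
```

The parallel-branch layout follows the abstract's description of concatenating CNN and BiLSTM features for "more comprehensive sentence attributes"; how the authors implement the multi-channel aspect is not specified here and is simplified to multiple kernel sizes.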

Details

Language :
English
ISSN :
2164-2583
Volume :
10
Issue :
1
Database :
Complementary Index
Journal :
Systems Science & Control Engineering
Publication Type :
Academic Journal
Accession number :
160969605
Full Text :
https://doi.org/10.1080/21642583.2022.2123060