
Two End-to-End Quantum-Inspired Deep Neural Networks for Text Classification

Authors :
Shi, Jinjing
Li, Zhenhuan
Lai, Wei
Li, Fangfang
Shi, Ronghua
Feng, Yanyan
Zhang, Shichao
Source :
IEEE Transactions on Knowledge and Data Engineering; 2023, Vol. 35 Issue: 4 p4335-4345, 11p
Publication Year :
2023

Abstract

In linguistics, the uncertainty of context caused by polysemy is widespread and has attracted much attention. Quantum-inspired complex word embedding based on Hilbert space plays an important role in natural language processing (NLP), as it fully leverages the similarity between quantum states and word tokens. A word with multiple meanings can correspond to a single quantum particle that may exist in several possible states, and a sentence can be treated as a quantum system in which particles interfere with one another. Motivated by quantum-inspired complex word embedding, an interpretable complex-valued word embedding (ICWE) is proposed and used to design two end-to-end quantum-inspired deep neural networks for binary text classification: ICWE-QNN and CICWE-QNN, the latter being a convolutional complex-valued neural network based on ICWE. Their feasibility and effectiveness in NLP applications are demonstrated, and they address the text information loss of the CE-Mix [1] model, which neglects important linguistic features of text: linguistic feature extraction is performed in our models with deep learning components, in which a gated recurrent unit (GRU) extracts the sequence information of sentences, an attention mechanism makes the model focus on important words, and a convolutional layer captures the local features of the projected matrix. ICWE-QNN avoids the random combination of word tokens, and CICWE-QNN fully considers the textual features of the projected matrix. Experiments conducted on five benchmark classification datasets demonstrate that the proposed models achieve higher accuracy than the compared traditional models, including CaptionRep BOW, DictRep BOW and Paragram-Phrase, and they also perform well on F1-score. In particular, the CICWE-QNN model also achieves higher accuracy than the quantum-inspired CE-Mix model on four datasets: SST, SUBJ, CR and MPQA. Designing quantum-inspired deep neural networks is a meaningful and effective exploration for improving the performance of text classification.
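The sketch below is a minimal, hypothetical reconstruction of the kind of pipeline the abstract describes, assuming the complex-valued embedding is stored as separate amplitude and phase tables and fed to a real-valued GRU with attention pooling. All class names, dimensions, and design choices here are illustrative assumptions based only on the abstract, not the authors' implementation; the additional convolutional layer of CICWE-QNN over the projected matrix is omitted for brevity.

```python
# Hypothetical ICWE-QNN-style classifier, reconstructed from the abstract only:
# complex-valued word embeddings (amplitude + phase), a GRU for sequence
# information, an attention layer over hidden states, and a binary classifier.
import torch
import torch.nn as nn


class ComplexEmbedding(nn.Module):
    """Represents each token as a complex vector r * exp(i * phi),
    stored as two real-valued embedding tables (amplitude and phase)."""
    def __init__(self, vocab_size, dim):
        super().__init__()
        self.amplitude = nn.Embedding(vocab_size, dim)
        self.phase = nn.Embedding(vocab_size, dim)

    def forward(self, token_ids):
        r = self.amplitude(token_ids)            # (batch, seq, dim)
        phi = self.phase(token_ids)              # (batch, seq, dim)
        real = r * torch.cos(phi)
        imag = r * torch.sin(phi)
        # Concatenate real and imaginary parts so downstream real-valued
        # layers (GRU, attention) can consume the complex embedding.
        return torch.cat([real, imag], dim=-1)   # (batch, seq, 2*dim)


class QuantumInspiredTextClassifier(nn.Module):
    """Illustrative ICWE-QNN-like model: complex embedding -> GRU ->
    attention pooling -> binary classification head."""
    def __init__(self, vocab_size, dim=64, hidden=128):
        super().__init__()
        self.embed = ComplexEmbedding(vocab_size, dim)
        self.gru = nn.GRU(2 * dim, hidden, batch_first=True)
        self.attn = nn.Linear(hidden, 1)         # additive attention scores
        self.classifier = nn.Linear(hidden, 2)   # two-class output

    def forward(self, token_ids):
        h, _ = self.gru(self.embed(token_ids))           # (batch, seq, hidden)
        weights = torch.softmax(self.attn(h), dim=1)     # focus on key words
        sentence = (weights * h).sum(dim=1)              # weighted pooling
        return self.classifier(sentence)                 # logits


if __name__ == "__main__":
    model = QuantumInspiredTextClassifier(vocab_size=10000)
    dummy = torch.randint(0, 10000, (4, 20))   # batch of 4 sentences, 20 tokens
    print(model(dummy).shape)                  # torch.Size([4, 2])
```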

Details

Language :
English
ISSN :
1041-4347 and 1558-2191
Volume :
35
Issue :
4
Database :
Supplemental Index
Journal :
IEEE Transactions on Knowledge and Data Engineering
Publication Type :
Periodical
Accession number :
ejs62453232
Full Text :
https://doi.org/10.1109/TKDE.2021.3130598