
Attention-Based Convolution Skip Bidirectional Long Short-Term Memory Network for Speech Emotion Recognition

Authors:
Henry Han
Huiyun Zhang
Heming Huang
Source:
IEEE Access, Vol 9, Pp 5332-5342 (2021)
Publication Year:
2021
Publisher:
IEEE, 2021.

Abstract

Speech emotion recognition is a challenging task in natural language processing. It relies heavily on the effectiveness of speech features and acoustic models. However, existing acoustic models may not handle speech emotion recognition efficiently due to their built-in limitations. In this work, a novel deep-learning acoustic model called attention-based skip convolution bi-directional long short-term memory, abbreviated as SCBAMM, is proposed for speech emotion recognition. It has eight hidden layers: two dense layers, a convolutional layer, a skip layer, a mask layer, a Bi-LSTM layer, an attention layer, and a pooling layer. SCBAMM makes better use of spatiotemporal information and captures emotion-related features more effectively. In addition, it mitigates the problems of exploding and vanishing gradients in deep learning to some extent. On the EMO-DB and CASIA databases, the proposed SCBAMM achieves accuracy rates of 94.58% and 72.50%, respectively. To the best of our knowledge, these are the highest accuracy rates reported among peer models.
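The abstract names the layer stack (dense, convolution, skip, mask, Bi-LSTM, attention, pooling) but not its configuration. The following is a minimal PyTorch sketch of such a stack, assuming frame-level acoustic features as input; all layer sizes, the placement of the skip connection, the masking scheme, and the number of emotion classes are assumptions for illustration, not values taken from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SCBAMMSketch(nn.Module):
    """Illustrative sketch of the layer stack described in the abstract.
    Dimensions (n_features, conv_channels, lstm_units, n_classes) are assumed."""

    def __init__(self, n_features=40, conv_channels=64, lstm_units=128, n_classes=7):
        super().__init__()
        # Two dense layers applied frame-wise to the input features
        self.dense1 = nn.Linear(n_features, 256)
        self.dense2 = nn.Linear(256, 128)
        # 1-D convolution over the time axis for local spectral-temporal patterns
        self.conv = nn.Conv1d(128, conv_channels, kernel_size=3, padding=1)
        # Skip layer: project the dense output so it can be added around the conv layer
        self.skip_proj = nn.Linear(128, conv_channels)
        # Bi-LSTM over the (conv + skip) sequence for long-range temporal modelling
        self.bilstm = nn.LSTM(conv_channels, lstm_units,
                              batch_first=True, bidirectional=True)
        # Additive attention scores over Bi-LSTM time steps
        self.attn = nn.Linear(2 * lstm_units, 1)
        # Emotion classifier on the attention-pooled representation
        self.out = nn.Linear(2 * lstm_units, n_classes)

    def forward(self, x, mask=None):
        # x: (batch, time, n_features); mask: (batch, time), 1 for valid frames
        h = F.relu(self.dense1(x))
        h = F.relu(self.dense2(h))                      # (B, T, 128)
        c = F.relu(self.conv(h.transpose(1, 2)))        # (B, C, T)
        c = c.transpose(1, 2)                           # (B, T, C)
        c = c + self.skip_proj(h)                       # skip connection around the conv layer
        if mask is not None:
            c = c * mask.unsqueeze(-1)                  # mask layer: zero out padded frames
        seq, _ = self.bilstm(c)                         # (B, T, 2 * lstm_units)
        scores = self.attn(seq).squeeze(-1)             # (B, T)
        if mask is not None:
            scores = scores.masked_fill(mask == 0, float("-inf"))
        w = torch.softmax(scores, dim=1).unsqueeze(-1)  # attention weights
        pooled = (w * seq).sum(dim=1)                   # attention-weighted pooling over time
        return self.out(pooled)                         # class logits
```

A skip connection of this kind gives the gradient a direct path around the convolutional block, which is one common way such architectures reduce vanishing/exploding gradients; whether the paper implements the skip and mask layers exactly this way is not specified in the abstract.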

Details

Language:
English
ISSN:
2169-3536
Volume:
9
Database:
OpenAIRE
Journal:
IEEE Access
Accession number:
edsair.doi.dedup.....6a487feb6679724b0461391d987797b9