
Attention-Based Convolution Skip Bidirectional Long Short-Term Memory Network for Speech Emotion Recognition

Authors :
Huiyun Zhang
Heming Huang
Henry Han
Source :
IEEE Access, Vol. 9, pp. 5332-5342 (2021)
Publication Year :
2021
Publisher :
IEEE, 2021.

Abstract

Speech emotion recognition is a challenging task in natural language processing. It relies heavily on the effectiveness of speech features and acoustic models. However, existing acoustic models may not handle speech emotion recognition efficiently because of their built-in limitations. In this work, a novel deep-learning acoustic model, called attention-based skip convolution bi-directional long short-term memory and abbreviated as SCBAMM, is proposed to recognize speech emotion. It has eight hidden layers, namely, two dense layers, a convolutional layer, a skip layer, a mask layer, a Bi-LSTM layer, an attention layer, and a pooling layer. SCBAMM makes better use of spatiotemporal information and captures emotion-related features more effectively. In addition, it alleviates the problems of exploding and vanishing gradients in deep learning to some extent. On the EMO-DB and CASIA databases, the proposed SCBAMM achieves accuracy rates of 94.58% and 72.50%, respectively. To the best of our knowledge, these are the best accuracy rates reported among peer models.
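
The abstract only names the eight hidden layers, so the following is a minimal, hypothetical tf.keras sketch of such a layer stack. All layer widths, the input feature shape, the self-attention formulation, and the explicit padding-mask input are assumptions for illustration, not details taken from the paper.

```python
import tensorflow as tf
from tensorflow.keras import layers, Model

def build_scbamm_like(time_steps=300, n_features=40, n_emotions=7):
    # Frame-level acoustic features and a padding mask (1 = real frame, 0 = padding).
    feats = layers.Input(shape=(time_steps, n_features), name="frame_features")
    pad_mask = layers.Input(shape=(time_steps, 1), name="padding_mask")

    x = layers.Dense(128, activation="relu")(feats)   # dense layer 1
    x = layers.Dense(128, activation="relu")(x)       # dense layer 2

    conv = layers.Conv1D(128, kernel_size=3, padding="same",
                         activation="relu")(x)        # convolutional layer
    x = layers.Add()([x, conv])                        # skip connection around the conv block

    # Stand-in for the mask layer: zero out padded frames before the recurrent stage.
    x = layers.Lambda(lambda t: t[0] * t[1])([x, pad_mask])

    x = layers.Bidirectional(
        layers.LSTM(128, return_sequences=True))(x)    # Bi-LSTM layer
    x = layers.Attention()([x, x])                     # self-attention over time steps
    x = layers.GlobalAveragePooling1D()(x)             # pooling layer

    out = layers.Dense(n_emotions, activation="softmax")(x)  # emotion posteriors
    return Model([feats, pad_mask], out)

model = build_scbamm_like()
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
model.summary()
```

The residual Add() around the convolution and the attention-then-pooling readout are one plausible reading of how such a stack could ease gradient flow and emphasize emotion-salient frames; the paper's exact ordering and attention mechanism may differ.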

Details

Language :
English
ISSN :
2169-3536
Volume :
9
Database :
Directory of Open Access Journals
Journal :
IEEE Access
Publication Type :
Academic Journal
Accession number :
edsdoj.1daf24bcc12e4c2b89fa6be079a35f76
Document Type :
article
Full Text :
https://doi.org/10.1109/ACCESS.2020.3047395