
eXnet: An Efficient Approach for Emotion Recognition in the Wild

Authors :
Muhammad Naveed Riaz
Yao Shen
Muhammad Sohail
Minyi Guo
Source :
Sensors, Vol 20, Iss 4, p 1087 (2020)
Publication Year :
2020
Publisher :
MDPI AG, 2020.

Abstract

Facial expression recognition has been well studied for its great importance in the areas of human–computer interaction and the social sciences. With the evolution of deep learning, there have been significant advances in this area, some even surpassing human-level accuracy. Although these methods achieve good accuracy, they still suffer from two constraints, high computational cost and memory consumption, which are especially critical for small, hardware-constrained devices. To alleviate this issue, we propose a new Convolutional Neural Network (CNN) architecture, eXnet (Expression Net), based on parallel feature extraction, which surpasses current methods in accuracy while containing far fewer parameters (eXnet: 4.57 million, VGG19: 14.72 million), making it more efficient and lightweight for real-time systems. Several modern data augmentation techniques are applied to improve the generalization of eXnet; these techniques raise the accuracy of the network by reducing overfitting without increasing the model size. We provide an extensive evaluation of our network against key methods on the Facial Expression Recognition 2013 (FER-2013), Extended Cohn-Kanade (CK+), and Real-world Affective Faces Database (RAF-DB) benchmark datasets. We also perform an ablation study to show the importance of the different components of our architecture. To evaluate the efficiency of eXnet on embedded systems, we deploy it on a Raspberry Pi 4B. All these evaluations show the superiority of eXnet for emotion recognition in the wild in terms of accuracy, parameter count, and size on disk.
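The abstract does not give the layer configuration of eXnet, but a minimal sketch of the parallel feature-extraction idea it describes, several convolutional branches applied to the same input and concatenated, might look like the following. The branch widths, kernel sizes, block count, and the 7-class, 48x48 grayscale input (the FER-2013 format) are illustrative assumptions, not the published eXnet design.

```python
# Illustrative sketch only: an Inception-style parallel feature-extraction
# block plus a tiny classifier head. Branch widths, kernel sizes, and depth
# are assumptions for illustration, not the published eXnet architecture.
import torch
import torch.nn as nn

class ParallelBlock(nn.Module):
    def __init__(self, in_ch, branch_ch):
        super().__init__()
        # Three parallel branches with different receptive fields.
        self.b1 = nn.Sequential(
            nn.Conv2d(in_ch, branch_ch, kernel_size=1),
            nn.BatchNorm2d(branch_ch), nn.ReLU(inplace=True))
        self.b3 = nn.Sequential(
            nn.Conv2d(in_ch, branch_ch, kernel_size=3, padding=1),
            nn.BatchNorm2d(branch_ch), nn.ReLU(inplace=True))
        self.b5 = nn.Sequential(
            nn.Conv2d(in_ch, branch_ch, kernel_size=5, padding=2),
            nn.BatchNorm2d(branch_ch), nn.ReLU(inplace=True))

    def forward(self, x):
        # Features extracted in parallel are concatenated along channels.
        return torch.cat([self.b1(x), self.b3(x), self.b5(x)], dim=1)

class TinyExpressionNet(nn.Module):
    def __init__(self, num_classes=7):  # FER-2013 has 7 expression classes
        super().__init__()
        self.stem = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1),  # 48x48 grayscale faces
            nn.BatchNorm2d(32), nn.ReLU(inplace=True), nn.MaxPool2d(2))
        self.block = ParallelBlock(32, 32)   # 3 branches -> 96 channels
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Linear(96, num_classes)

    def forward(self, x):
        x = self.pool(self.block(self.stem(x)))
        return self.fc(torch.flatten(x, 1))

# Example: a batch of 8 grayscale 48x48 faces (FER-2013 resolution).
logits = TinyExpressionNet()(torch.randn(8, 1, 48, 48))
print(logits.shape)  # torch.Size([8, 7])
```

Extracting multi-scale features in parallel and concatenating them, rather than stacking many wide sequential layers, is one common way to keep parameter counts low, which is consistent with the size comparison quoted in the abstract (eXnet: 4.57 million vs. VGG19: 14.72 million parameters).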

Details

Language :
English
ISSN :
1424-8220
Volume :
20
Issue :
4
Database :
Directory of Open Access Journals
Journal :
Sensors
Publication Type :
Academic Journal
Accession number :
edsdoj.95e59f2e0c3f4f5dab73d2b32ed8ff93
Document Type :
article
Full Text :
https://doi.org/10.3390/s20041087