Deep learning approaches for recognizing facial emotions on autistic patients.
- Source :
- International Journal of Electrical & Computer Engineering (2088-8708); Aug2024, Vol. 14 Issue 4, p4034-4045, 12p
- Publication Year :
- 2024
Abstract
- Autistic people need continuous assistance to improve their quality of life, and chatbots are one of the technologies that can provide this support today by accompanying the person in daily life. The chatbot we plan to develop gives autistic people immediate, personalized recommendations by determining the person's emotional state, interacting with them, and building an individual profile that helps medical professionals know their patients better and provide individualized care. To gain an understanding of emotions, we attempted to identify the emotion expressed by the face in an image. Deep learning methods, namely convolutional neural networks and vision transformers, were compared on the FER2013 dataset. After optimization, the convolutional neural network (CNN) achieved 74% accuracy, whereas the vision transformer (ViT) achieved 69%. Because no large dataset of autistic individuals is available, we combined photos of autistic people from two distinct sources and used the CNN model to identify the relevant emotion, reaching 65% accuracy for facial emotion recognition. The model still has some identification limitations, such as misinterpreting certain emotions, particularly "neutral," "surprised," and "angry," because these emotions and facial traits are weakly expressed by autistic people and because the model is trained on imbalanced emotion categories. [ABSTRACT FROM AUTHOR]
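- Illustrative sketch: the abstract compares a CNN and a ViT on FER2013 (48x48 grayscale faces, 7 emotion classes) but does not publish its architecture. The snippet below is a minimal, hypothetical PyTorch CNN for that input format, not the authors' model; the class name EmotionCNN, layer sizes, and dropout rate are assumptions for illustration only.

```python
# Minimal sketch (not the authors' code): a small CNN classifier for
# 7-class facial emotion recognition on 48x48 grayscale, FER2013-style
# images. All layer widths and the dropout rate are illustrative guesses.
import torch
import torch.nn as nn


class EmotionCNN(nn.Module):
    def __init__(self, num_classes: int = 7):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),   # 48x48 -> 24x24
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),   # 24x24 -> 12x12
            nn.Conv2d(64, 128, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),   # 12x12 -> 6x6
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(128 * 6 * 6, 256), nn.ReLU(), nn.Dropout(0.5),
            nn.Linear(256, num_classes),  # logits for the 7 emotions
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))


if __name__ == "__main__":
    # Shape check with a dummy batch of 48x48 grayscale face crops.
    model = EmotionCNN()
    dummy = torch.randn(8, 1, 48, 48)
    print(model(dummy).shape)  # torch.Size([8, 7])
```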
Details
- Language :
- English
- ISSN :
- 2088-8708
- Volume :
- 14
- Issue :
- 4
- Database :
- Complementary Index
- Journal :
- International Journal of Electrical & Computer Engineering (2088-8708)
- Publication Type :
- Academic Journal
- Accession number :
- 178843296
- Full Text :
- https://doi.org/10.11591/ijece.v14i4.pp4034-4045