EmoReact: a multimodal approach and dataset for recognizing emotional responses in children
- Source : ICMI
- Publication Year : 2016
- Publisher : ACM, 2016.
Abstract
- Automatic emotion recognition plays a central role in the technologies underlying social robots, affect-sensitive human-computer interaction design and affect-aware tutors. Although there has been a considerable amount of research on automatic emotion recognition in adults, emotion recognition in children has been understudied. The problem is more challenging because children tend to fidget and move around more than adults, leading to more self-occlusions and non-frontal head poses. In addition, the lack of publicly available child datasets with annotated emotion labels leads most researchers to focus on adults. In this paper, we introduce a newly collected multimodal emotion dataset of children between the ages of four and fourteen. The dataset contains 1102 audio-visual clips annotated for 17 different emotional states: six basic emotions, neutral, valence, and nine complex emotions including curiosity, uncertainty and frustration. Our experiments compare unimodal and multimodal emotion recognition baseline models to enable future research on this topic. Finally, we present a detailed analysis of the most indicative behavioral cues for emotion recognition in children.
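The abstract contrasts unimodal and multimodal baseline models. The sketch below illustrates what such a comparison can look like in practice; it is not the authors' published pipeline. The placeholder features, their dimensions, the random labels, and the choice of a one-vs-rest linear SVM with simple feature concatenation (early fusion) are all assumptions made for illustration only.

```python
# Illustrative unimodal vs. early-fusion multimodal baseline (hypothetical data).
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.multiclass import OneVsRestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import f1_score

rng = np.random.default_rng(0)

n_clips = 1102   # number of audio-visual clips reported in the abstract
n_labels = 17    # 17 annotated emotional states

# Hypothetical per-clip features, e.g. pooled visual (facial) and audio
# (acoustic) descriptors; real features would come from dedicated extractors.
X_visual = rng.normal(size=(n_clips, 128))
X_audio = rng.normal(size=(n_clips, 64))
Y = rng.integers(0, 2, size=(n_clips, n_labels))  # multi-label annotations

def evaluate(X, Y):
    """Train a one-vs-rest linear SVM and report micro-averaged F1."""
    X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, test_size=0.3, random_state=0)
    clf = OneVsRestClassifier(LinearSVC(C=1.0, max_iter=5000))
    clf.fit(X_tr, Y_tr)
    return f1_score(Y_te, clf.predict(X_te), average="micro")

# Unimodal baselines vs. a simple early-fusion (feature concatenation) baseline.
print("visual only :", evaluate(X_visual, Y))
print("audio only  :", evaluate(X_audio, Y))
print("early fusion:", evaluate(np.concatenate([X_visual, X_audio], axis=1), Y))
```

With real features, the comparison of the three printed scores is what distinguishes a multimodal baseline from its unimodal counterparts.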
- Subjects :
- Social robot
Computer science
Emotion classification
05 social sciences
Multimodal therapy
02 engineering and technology
Facial analysis
0202 electrical engineering, electronic engineering, information engineering
Curiosity
020201 artificial intelligence & image processing
0501 psychology and cognitive sciences
Emotion recognition
Valence (psychology)
050104 developmental & child psychology
Cognitive psychology
Details
- Database : OpenAIRE
- Journal : Proceedings of the 18th ACM International Conference on Multimodal Interaction
- Accession number : edsair.doi...........95bcea4590acac3999610514e161a694
- Full Text : https://doi.org/10.1145/2993148.2993168