A multi-componential analysis of emotions during complex learning with an intelligent multi-agent system.
- Source :
- Computers in Human Behavior, Jul 2015, Vol. 48, p615-625. 11p.
- Publication Year :
- 2015
Abstract
- This paper presents the evaluation of the synchronization of three emotional measurement methods (automatic facial expression recognition, self-report, electrodermal activity) and their agreement regarding learners’ emotions. Data were collected from 67 undergraduates enrolled at a North American university who learned about a complex science topic while interacting with MetaTutor, a multi-agent computerized learning environment. Videos of learners’ facial expressions captured with a webcam were analyzed using automatic facial recognition software (FaceReader 5.0). Learners’ physiological arousal was recorded using Affectiva’s Q-Sensor 2.0 electrodermal activity measurement bracelet. Learners self-reported their experience of 19 different emotional states on five different occasions during the learning session, which were used as markers to synchronize data from FaceReader and Q-Sensor. We found high agreement between the facial and self-report data (75.6%), but low levels of agreement between them and the Q-Sensor data, suggesting that a tightly coupled relationship does not always exist between emotional response components. [ABSTRACT FROM AUTHOR]
Details
- Language :
- English
- ISSN :
- 0747-5632
- Volume :
- 48
- Database :
- Academic Search Index
- Journal :
- Computers in Human Behavior
- Publication Type :
- Academic Journal
- Accession number :
- 101926560
- Full Text :
- https://doi.org/10.1016/j.chb.2015.02.013