Recognition of Empathy from Synchronization between Brain Activity and Eye Movement
- Author
- Zhang, Jing; Park, Sung; Cho, Ayoung; Whang, Mincheol
- Subjects
- EYE movements; EMPATHY; BRAIN waves; USER-generated content; SYNCHRONIZATION; SHARED virtual environments; PREFRONTAL cortex
- Abstract
- In the era of user-generated content (UGC) and virtual interactions within the metaverse, empathic digital content has become increasingly important. This study aimed to quantify human empathy levels during exposure to digital media. To assess empathy, we analyzed brain wave activity and eye movements in response to emotional videos. Forty-seven participants watched eight emotional videos while we recorded their brain activity and eye movement data; after each video session, participants provided subjective evaluations. Our analysis focused on the relationship between brain activity and eye movement in recognizing empathy. The findings revealed the following: (1) Participants were more inclined to empathize with videos depicting pleasant-arousal and unpleasant-relaxed emotions. (2) Saccades and fixations, key components of eye movement, occurred simultaneously with activity in specific channels of the prefrontal and temporal lobes. (3) Eigenvalues of brain activity and pupil changes showed synchronization between the right pupil and certain channels in the prefrontal, parietal, and temporal lobes during empathic responses. These results suggest that eye movement characteristics can serve as an indicator of the cognitive empathic process when engaging with digital content. Furthermore, the observed changes in pupil size result from a combination of the emotional and cognitive empathy elicited by the videos. [ABSTRACT FROM AUTHOR]
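- Note: the abstract does not specify the exact synchronization metric used in the study. As a rough illustration only, the sketch below (Python with NumPy/SciPy; all function and variable names are hypothetical and the data are simulated) shows one common way such synchrony between an EEG feature and pupil size might be operationalized: windowed band power per EEG channel, mean pupil diameter per matching window, then zero-lag correlation and lagged cross-correlation.

```python
import numpy as np
from scipy.signal import correlate
from scipy.stats import pearsonr

def band_power(eeg, fs, low, high, win_sec=1.0):
    """Sliding-window band power of one EEG channel (illustrative helper)."""
    win = int(win_sec * fs)
    freqs = np.fft.rfftfreq(win, d=1.0 / fs)
    band = (freqs >= low) & (freqs <= high)
    powers = []
    for i in range(len(eeg) // win):
        segment = eeg[i * win:(i + 1) * win]
        spectrum = np.abs(np.fft.rfft(segment)) ** 2
        powers.append(spectrum[band].sum())
    return np.asarray(powers)

def pupil_windows(pupil, fs_pupil, win_sec=1.0):
    """Mean pupil diameter per window, matched to the EEG windowing."""
    win = int(win_sec * fs_pupil)
    return np.asarray([pupil[i * win:(i + 1) * win].mean()
                       for i in range(len(pupil) // win)])

# Simulated data: 60 s of one prefrontal EEG channel (250 Hz) and pupil size (60 Hz).
rng = np.random.default_rng(0)
fs_eeg, fs_pupil, duration = 250, 60, 60
eeg = rng.standard_normal(fs_eeg * duration)
pupil = 3.0 + 0.2 * rng.standard_normal(fs_pupil * duration)

alpha = band_power(eeg, fs_eeg, 8, 13)       # alpha-band power per 1-s window
pupil_mean = pupil_windows(pupil, fs_pupil)  # mean pupil size per 1-s window
n = min(len(alpha), len(pupil_mean))

# Zero-lag association between the EEG feature and pupil size.
r, p = pearsonr(alpha[:n], pupil_mean[:n])
print(f"zero-lag correlation r={r:.3f}, p={p:.3f}")

# Lagged cross-correlation: does pupil change lead or lag the EEG feature?
a = (alpha[:n] - alpha[:n].mean()) / alpha[:n].std()
b = (pupil_mean[:n] - pupil_mean[:n].mean()) / pupil_mean[:n].std()
xcorr = correlate(a, b, mode="full") / n
lags = np.arange(-n + 1, n)
print("peak-synchrony lag (s):", lags[np.argmax(xcorr)])
```

- In a real analysis this would be repeated per EEG channel and per participant, with statistical testing across the empathic versus non-empathic conditions; the windowing, band, and correlation choices here are assumptions for illustration, not the authors' method.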
- Published
- 2023