VEmotion: Using Driving Context for Indirect Emotion Prediction in Real-Time
- Source: UIST
- Publication Year: 2021
- Publisher: ACM, 2021.
Abstract
- Detecting emotions while driving remains a challenge in Human-Computer Interaction. Current methods to estimate the driver's experienced emotions use physiological sensing (e.g., skin conductance, electroencephalography), speech, or facial expressions. However, these approaches require drivers to wear devices, perform explicit voice interaction, or display robust facial expressiveness. We present VEmotion (Virtual Emotion Sensor), a novel method to predict driver emotions in an unobtrusive way using contextual smartphone data. VEmotion analyzes information including traffic dynamics, environmental factors, in-vehicle context, and road characteristics to implicitly classify driver emotions. We demonstrate its applicability in a real-world driving study (N = 12) that evaluates the emotion prediction performance. Our results show that VEmotion outperforms facial expressions by 29% in a person-dependent classification and by 8.5% in a person-independent classification. We discuss how VEmotion enables empathic car interfaces to sense the driver's emotions and provide in-situ interface adaptations on the go.
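The abstract does not specify VEmotion's classifier or feature set, but the person-independent evaluation it mentions is commonly implemented as leave-one-driver-out cross-validation over contextual features. The following is a minimal, hypothetical sketch of that protocol using a toy nearest-centroid model; the driver IDs, feature values (speed, traffic density, weather score), and emotion labels are all invented for illustration and are not from the paper:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical contextual samples: (driver_id, feature_vector, emotion_label).
# Features here stand in for signals like traffic dynamics or road context;
# the actual VEmotion feature set is described in the paper, not this record.
SAMPLES = [
    ("d1", (30.0, 0.8, 0.2), "frustrated"),
    ("d1", (90.0, 0.1, 0.9), "happy"),
    ("d1", (25.0, 0.9, 0.3), "frustrated"),
    ("d1", (85.0, 0.2, 0.8), "happy"),
    ("d2", (28.0, 0.7, 0.1), "frustrated"),
    ("d2", (95.0, 0.2, 0.9), "happy"),
    ("d2", (32.0, 0.8, 0.2), "frustrated"),
    ("d2", (88.0, 0.1, 0.7), "happy"),
]

def centroids(train):
    """Mean feature vector per emotion label (a toy nearest-centroid model)."""
    by_label = defaultdict(list)
    for _, feats, label in train:
        by_label[label].append(feats)
    return {lab: tuple(mean(col) for col in zip(*vecs))
            for lab, vecs in by_label.items()}

def predict(model, feats):
    """Assign the label whose centroid is closest in squared distance."""
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b))
    return min(model, key=lambda lab: dist(model[lab], feats))

def person_independent_accuracy(samples):
    """Leave-one-driver-out: train on the other drivers, test on the held-out one."""
    correct = total = 0
    for held_out in {d for d, _, _ in samples}:
        model = centroids([s for s in samples if s[0] != held_out])
        for _, feats, label in (s for s in samples if s[0] == held_out):
            correct += predict(model, feats) == label
            total += 1
    return correct / total
```

A person-dependent evaluation would instead split each driver's own samples into train and test portions, which typically yields higher accuracy (consistent with the larger margin the abstract reports for the person-dependent case).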
Details
- Database: OpenAIRE
- Journal: The 34th Annual ACM Symposium on User Interface Software and Technology
- Accession number: edsair.doi...........b9f14acc8d4c2554600fb96eb1cc4659
- Full Text: https://doi.org/10.1145/3472749.3474775