Task Load Estimation from Multimodal Head-Worn Sensors Using Event Sequence Features
- Author
- Siyuan Chen and Julien Epps
- Subjects
Sequence, Modalities, Computer science, Event (computing), Pattern recognition, Task (project management), Human-Computer Interaction, Variable (computer science), Task analysis, Artificial intelligence, Human factors, Software
- Abstract
For longitudinal behavior analysis, task type is an inevitable and important variable. In this article, we propose an event-based behavior modeling approach and employ non-invasive wearable sensing modalities (eye activity, speech and head movement) to recognize task load level under four different task load types. The novelty lies in converting physiological and behavioral signals into meaningful events and utilizing their sequence across multiple modalities to distinguish load levels and types. We evaluated this approach on head-worn sensor data from 24 participants completing four different tasks for recognizing (i) low and high load level for a given task load type, (ii) low and high load level regardless of load type, and (iii) both load level and load type. Findings show that the recognition rate is reasonable in (i), close to chance level in (ii), and well above chance level in (iii) for 8 classes using participant-dependent and -independent schemes. Further, a fusion of the proposed event-based features and conventional continuous features achieved the best or similar performance in most cases. These results suggest that task type needs to be considered when using continuous features and that the proposed event-based modeling paradigm is promising for longitudinal behavior analysis.
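The core idea of the abstract, converting continuous physiological/behavioral signals into discrete events and using their cross-modal sequence as features, can be illustrated with a minimal sketch. This is not the authors' implementation: the thresholding rule, event labels, and bigram counting below are hypothetical simplifications chosen only to make the event-sequence idea concrete.

```python
# Illustrative sketch (hypothetical, not the paper's method): discretize
# continuous signals into labeled events, then count cross-modal event
# bigrams over the merged, time-ordered event stream as sequence features.
from collections import Counter

def to_events(samples, threshold, label):
    """Emit a (time_index, label) event each time the signal crosses above threshold."""
    events = []
    above = False
    for t, x in enumerate(samples):
        if x > threshold and not above:
            events.append((t, label))
        above = x > threshold
    return events

def bigram_features(events):
    """Merge time-ordered events from all modalities and count label bigrams."""
    ordered = [label for _, label in sorted(events)]
    return Counter(zip(ordered, ordered[1:]))

# Toy example with one "eye" channel and one "speech" channel.
eye = to_events([0.1, 0.9, 0.2, 0.8, 0.1], threshold=0.5, label="blink")
speech = to_events([0.0, 0.0, 0.7, 0.0, 0.6], threshold=0.5, label="voiced")
features = bigram_features(eye + speech)
# features counts patterns such as ("blink", "voiced"), i.e. a blink followed
# by voiced speech; such counts could feed a load-level/type classifier.
```

In the actual work these features would be computed per task segment and, as the abstract notes, fused with conventional continuous features before classification.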
- Published
- 2021