
Multimodal Classification of Stressful Environments in Visually Impaired Mobility Using EEG and Peripheral Biosignals.

Authors :
Saitis, Charalampos
Kalimeri, Kyriaki
Source :
IEEE Transactions on Affective Computing; Jan-Mar 2021, Vol. 12, Issue 1, p203-214, 12p
Publication Year :
2021

Abstract

In this study, we aim to better understand the cognitive-emotional experience of visually impaired people when navigating unfamiliar urban environments, both outdoor and indoor. We propose a multimodal framework based on random forest classifiers, which predict the actual environment among predefined generic classes of urban settings, inferring from real-time, non-invasive, ambulatory monitoring of brain and peripheral biosignals. Model performance reached 93 percent for the outdoor and 87 percent for the indoor environments (expressed in weighted AUROC), demonstrating the potential of the approach. Estimating the density distributions of the most predictive biomarkers, we present a series of geographic and temporal visualizations depicting the environmental contexts in which the most intense affective and cognitive reactions take place. A linear mixed model analysis revealed significant differences between categories of vision impairment, but not between normal and impaired vision. Despite the limited size of our cohort, these findings pave the way to emotionally intelligent mobility-enhancing systems, capable of implicit adaptation not only to changing environments but also to shifts in the affective state of the user in relation to different environmental and situational factors. [ABSTRACT FROM AUTHOR]
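The evaluation setup described in the abstract — a multiclass random forest predicting an environment class from biosignal features, scored with weighted AUROC — can be sketched as follows. This is a minimal illustration using scikit-learn with synthetic placeholder data; the feature set, class labels, and training protocol are assumptions for demonstration, not the authors' actual dataset or pipeline.

```python
# Hedged sketch: multiclass random-forest classification of environment
# labels from biosignal-style feature vectors, scored with weighted AUROC
# (the metric quoted in the abstract). All data here is synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Placeholder: 300 observation windows x 10 features (e.g., EEG band
# powers, heart rate, electrodermal activity) over 4 generic
# environment classes -- hypothetical stand-ins, not the paper's features.
X = rng.normal(size=(300, 10))
y = rng.integers(0, 4, size=300)
X[:, 0] += y  # inject a weak class signal so there is something to learn

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y
)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_tr, y_tr)

# Weighted one-vs-rest AUROC over the predicted class probabilities.
proba = clf.predict_proba(X_te)
auc = roc_auc_score(y_te, proba, multi_class="ovr", average="weighted")
print(f"weighted AUROC: {auc:.3f}")
```

With real data, the feature matrix would hold windowed EEG and peripheral biosignal descriptors rather than random draws; the weighted average accounts for class imbalance across environment categories.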

Details

Language :
English
ISSN :
1949-3045
Volume :
12
Issue :
1
Database :
Complementary Index
Journal :
IEEE Transactions on Affective Computing
Publication Type :
Academic Journal
Accession number :
148970273
Full Text :
https://doi.org/10.1109/TAFFC.2018.2866865