Decoding overt shifts of attention in depth through pupillary and cortical frequency tagging
- Source :
- Journal of Neural Engineering, 18(3)
- Publication Year :
- 2019
-
Abstract
- Objective. We have recently developed a prototype of a novel human-computer interface for assistive communication based on voluntary shifts of attention (gaze) from a far target to a near target, associated with a decrease in pupil size (Pupillary Accommodative Response, PAR), an automatic vegetative response that can be easily recorded. We report here an extension of that approach based on pupillary and cortical frequency tagging. Approach. In 18 healthy volunteers, we investigated the possibility of decoding attention shifts in depth by exploiting the evoked oscillatory responses of the pupil (Pupillary Oscillatory Response, POR, recorded through a low-cost device) and of the visual cortex (Steady-State Visual Evoked Potentials, SSVEP, recorded from 4 scalp electrodes). With a simple binary communication protocol (focusing on the far target meaning 'No', focusing on the near target meaning 'Yes'), we aimed at discriminating when the observer's overt attention (gaze) shifted from the far to the near target; the two targets flickered at different frequencies. Main results. By applying a binary linear classifier (Support Vector Machine, SVM, with leave-one-out cross-validation) to the POR and SSVEP signals, we found that, with only twenty trials and no behavioural training of the subjects, the offline median decoding accuracy was 75% and 80% with POR and SSVEP signals, respectively. When the two signals were combined, accuracy reached 83%.
The number of observers for whom accuracy was higher than 70% was 11/18, 12/18 and 14/18 with POR, SSVEP and combined features, respectively. A signal detection analysis confirmed these results. Significance. The present findings suggest that exploiting frequency tagging with pupillary or cortical responses during an attention shift in the depth plane, either separately or combined, is a promising approach to realizing a device for communicating with Complete Locked-In Syndrome (CLIS) patients when oculomotor control is unreliable and traditional assistive communication, even based on PAR, is unsuccessful.
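The decoding pipeline summarized in the abstract (signal power at the tagged flicker frequencies fed to a linear SVM with leave-one-out cross-validation) can be sketched as below. The sampling rate, the two tagging frequencies, and the synthetic pupil traces are illustrative assumptions for the sketch, not the paper's actual parameters or data:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(0)
FS = 60.0                  # assumed sampling rate of the pupil trace (Hz)
F_FAR, F_NEAR = 1.0, 1.5   # hypothetical tagging frequencies of the two targets

def tag_power(signal, fs, freq):
    """Spectral power of the signal at the FFT bin nearest the tagged frequency."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), 1.0 / fs)
    return spectrum[np.argmin(np.abs(freqs - freq))]

# Synthetic stand-in for twenty trials: each trace oscillates at the
# attended target's tagging frequency, plus additive noise.
n_trials, n_samples = 20, int(10 * FS)
t = np.arange(n_samples) / FS
labels = rng.integers(0, 2, n_trials)  # 0 = far target ('No'), 1 = near ('Yes')
X = np.array([
    np.sin(2 * np.pi * (F_NEAR if y else F_FAR) * t)
    + 0.5 * rng.standard_normal(n_samples)
    for y in labels
])

# One feature per tagged frequency, per trial.
feats = np.array([[tag_power(x, FS, F_FAR), tag_power(x, FS, F_NEAR)] for x in X])

# Binary linear SVM scored with leave-one-out cross-validation.
acc = cross_val_score(SVC(kernel="linear"), feats, labels, cv=LeaveOneOut()).mean()
print(f"LOO decoding accuracy: {acc:.2f}")
```

With clean synthetic oscillations the classifier separates the two conditions easily; on real POR/SSVEP data the same features would be noisier, which is why the paper reports per-subject accuracies rather than a single figure.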
- Subjects :
- Computer science
Speech recognition
Biomedical Engineering
Assistive communication
Attention shifts in depth
Brain-computer interface
Frequency tagging
Locked-in syndrome
Pupil oscillations
Steady-state visual evoked potentials
Linear classifier
Pupil
Cellular and Molecular Neuroscience
User-Computer Interface
Humans
Detection theory
Attention
Visual cortex
Flicker
Electroencephalography
Gaze
Brain-Computer Interfaces
Evoked Potentials, Visual
Decoding methods
Photic Stimulation
Details
- ISSN :
- 1741-2552
- Volume :
- 18
- Issue :
- 3
- Database :
- OpenAIRE
- Journal :
- Journal of Neural Engineering
- Accession number :
- edsair.doi.dedup.....35b61e05b4d1e735219501fd10d8e814