
Convolutional neural networks can identify brain interactions involved in decoding spatial auditory attention.

Authors :
Mahjoory, Keyvan
Bahmer, Andreas
Henry, Molly J.
Source :
PLoS Computational Biology. 8/8/2024, Vol. 20 Issue 8, p1-22. 22p.
Publication Year :
2024

Abstract

Human listeners can direct their attention to a single speaker in a multi-talker environment, and the neural correlates of this selective attention can be decoded from a single trial of electroencephalography (EEG) data. In this study, using source-reconstructed, anatomically resolved EEG data as inputs, we employed a convolutional neural network (CNN) as an interpretable model to uncover task-specific interactions between brain regions, rather than simply using it as a black-box decoder. To this end, our CNN was designed to learn pairwise interaction representations for 10 cortical regions from five-second inputs. Using these features exclusively for decoding, the model attained a median accuracy of 77.56% for within-participant and 65.14% for cross-participant classification. Through ablation analysis, together with dissecting the models' features and applying cluster analysis, we discerned alpha-band-dominated inter-hemisphere interactions, as well as alpha- and beta-band-dominant interactions that were either hemisphere-specific or characterized by contrasting patterns between the right and left hemispheres. These interactions were more pronounced in parietal and central regions for within-participant decoding, but in parietal, central, and partly frontal regions for cross-participant decoding. These findings demonstrate that our CNN model can effectively exploit features known to be important in auditory attention tasks, and suggest that applying domain-knowledge-inspired CNNs to source-reconstructed EEG data offers a novel computational framework for studying task-relevant brain interactions.

Author summary: In our study, we explored how the brain manages to focus on one speaker among many, a common challenge in noisy environments. Using advanced brainwave (EEG) data analysis, we developed a new method to understand how different parts of the brain communicate during this task. Our technique involves a type of artificial intelligence known as a convolutional neural network (CNN). However, instead of using it as a black box, we designed it to specifically reveal how ten brain areas work together in pairs during selective listening. Remarkably, our approach achieved a high level of accuracy in recognizing where someone's attention was directed, based on EEG data alone. We discovered that certain patterns of brain activity, especially in regions known for processing sound and spatial awareness, are crucial for focusing attention. This study not only advances our understanding of the brain's attention mechanisms but also introduces a promising tool for examining complex brain functions, offering insights that could help develop new tools for improving listening in challenging environments. [ABSTRACT FROM AUTHOR]
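To make the described architecture concrete, here is a minimal sketch (not the authors' published code) of a CNN that learns one interaction feature per pair of the 10 cortical regions from a five-second window and classifies from those pairwise features alone. The sampling rate, layer sizes, kernel width, and the choice to share a single pair encoder across all 45 region pairs are illustrative assumptions, not details taken from the paper.

```python
# Hypothetical sketch of a pairwise-interaction CNN for attention decoding.
# Assumptions (not from the paper): 100 Hz sampling rate, a shared 2-channel
# temporal encoder for every region pair, and a binary left/right output.
import itertools
import torch
import torch.nn as nn

N_REGIONS = 10        # cortical regions, as in the paper
FS = 100              # assumed sampling rate (Hz)
WIN = 5 * FS          # five-second input window, as in the paper

class PairwiseInteractionCNN(nn.Module):
    def __init__(self):
        super().__init__()
        # All 45 unordered region pairs.
        self.pairs = list(itertools.combinations(range(N_REGIONS), 2))
        # Small temporal CNN applied to the 2-channel time series of one
        # region pair; it outputs a single scalar interaction feature.
        self.pair_encoder = nn.Sequential(
            nn.Conv1d(2, 16, kernel_size=15, padding=7),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),   # collapse the time dimension
            nn.Flatten(),
            nn.Linear(16, 1),
        )
        # The classifier sees only the 45 pairwise features,
        # mirroring "exclusively utilizing these features for decoding".
        self.classifier = nn.Linear(len(self.pairs), 2)

    def forward(self, x):
        # x: (batch, N_REGIONS, WIN) source-reconstructed region time series
        feats = [self.pair_encoder(x[:, [i, j], :]) for i, j in self.pairs]
        return self.classifier(torch.cat(feats, dim=1))  # (batch, 2) logits

model = PairwiseInteractionCNN()
logits = model(torch.randn(4, N_REGIONS, WIN))  # dummy batch of 4 windows
print(logits.shape)                             # torch.Size([4, 2])
```

Because each pairwise feature is computed independently, an ablation analysis like the one described in the abstract could zero out individual pair features and measure the drop in decoding accuracy to rank region-pair interactions by importance.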

Details

Language :
English
ISSN :
1553-734X
Volume :
20
Issue :
8
Database :
Academic Search Index
Journal :
PLoS Computational Biology
Publication Type :
Academic Journal
Accession number :
178915048
Full Text :
https://doi.org/10.1371/journal.pcbi.1012376