Large Scale Functional Brain Networks Underlying Temporal Integration of Audio-Visual Speech Perception: An EEG Study
- Source :
- Frontiers in Psychology, Vol 7 (2016)
- Publication Year :
- 2016
- Publisher :
- Frontiers Media S.A., 2016.
Abstract
- Observable lip movements of the speaker influence perception of auditory speech. A classical example of this influence is reported by listeners who perceive an illusory (cross-modal) speech sound (McGurk effect) when presented with incongruent audio-visual (AV) speech stimuli. Recent neuroimaging studies of AV speech perception accentuate the role of frontal, parietal, and the integrative brain sites in the vicinity of the superior temporal sulcus (STS) for multisensory speech perception. However, whether and how a network spanning the whole brain participates in multisensory perceptual processing remains an open question. We posit that large-scale functional connectivity among neural populations situated in distributed brain sites may provide valuable insights into the processing and fusion of AV speech. Varying the psychophysical parameters in tandem with electroencephalogram (EEG) recordings, we exploited the trial-by-trial perceptual variability of incongruent AV speech stimuli to identify the characteristics of the large-scale cortical network that facilitates multisensory perception during synchronous and asynchronous AV speech. We evaluated the spectral landscape of EEG signals during multisensory speech perception at varying AV lags. Functional connectivity dynamics for all sensor pairs were computed using the time-frequency global coherence, the vector sum of pairwise coherence changes over time. During synchronous AV speech, we observed enhanced global gamma-band coherence and decreased alpha- and beta-band coherence underlying cross-modal (illusory) perception compared to unisensory perception around a temporal window of 300–600 ms following stimulus onset. During asynchronous speech stimuli, a global broadband coherence was observed during cross-modal perception at earlier times, along with pre-stimulus decreases of lower-frequency power, e.g., alpha rhythms for positive AV lags and theta rhythms for negative AV lags.
Thus, our study indicates that the temporal integration underlying multisensory speech perception needs to be understood in the framework of large-scale functional brain network mechanisms, in addition to the established cortical loci of multisensory speech perception.
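The abstract's central measure is global coherence across all EEG sensor pairs at each frequency. As a minimal sketch (not the authors' exact estimator, whose details are in the full paper), one common definition computes the sensor-by-sensor cross-spectral matrix at each frequency and takes the ratio of its largest eigenvalue to the sum of all eigenvalues; values near 1 indicate that one coherent network mode dominates the whole array. The function name and parameters below are illustrative assumptions:

```python
import numpy as np
from scipy.signal import csd

def global_coherence(x, fs, nperseg=256):
    """Global coherence per frequency for multichannel data.

    x : array of shape (n_channels, n_samples)
    Returns (freqs, gc) where gc[f] = largest eigenvalue of the
    cross-spectral matrix at f divided by the sum of eigenvalues
    (one standard definition; the paper's estimator may differ).
    """
    n_ch = x.shape[0]
    freqs, _ = csd(x[0], x[0], fs=fs, nperseg=nperseg)
    S = np.zeros((len(freqs), n_ch, n_ch), dtype=complex)
    for i in range(n_ch):
        for j in range(n_ch):
            # cross-spectral density between channels i and j
            _, S[:, i, j] = csd(x[i], x[j], fs=fs, nperseg=nperseg)
    # S is Hermitian at each frequency, so eigenvalues are real
    eig = np.linalg.eigvalsh(S)          # shape (n_freq, n_ch), ascending
    return freqs, eig[:, -1] / eig.sum(axis=1)
```

Applying this estimator in sliding windows over the trial yields a time-frequency global coherence, which is the kind of quantity contrasted here between cross-modal (illusory) and unisensory percepts.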
- Subjects :
- Speech perception
Speech recognition
Electroencephalography (EEG)
perception
integration
multisensory
Categorical perception
Crossmodal
functional connectivity
temporal synchrony
Superior temporal sulcus
coherence
Neurocomputational speech processing
AV
Details
- Language :
- English
- ISSN :
- 1664-1078
- Volume :
- 7
- Database :
- OpenAIRE
- Journal :
- Frontiers in Psychology
- Accession number :
- edsair.doi.dedup.....25afe74d497df6bfa3ea36cc259319d8