
Lip-reading enables the brain to synthesize auditory features of unknown silent speech

Authors :
Bourguignon, Mathieu
Baart, Martijn
Kapnoula, Efthymia
Molinaro, Nicola
Source :
Journal of Neuroscience vol.40 (2020) nr.5 p.1053-1065 [ISSN 0270-6474]
Publication Year :
2020

Abstract

Lip-reading is crucial for understanding speech in challenging conditions. But how the brain extracts meaning from silent visual speech is still under debate. Lip-reading in silence activates the auditory cortices, but it is not known whether such activation reflects immediate synthesis of the corresponding auditory stimulus or imagery of unrelated sounds. To disentangle these possibilities, we used magnetoencephalography to evaluate how cortical activity in 28 healthy adult humans (17 females) entrained to the auditory speech envelope and lip movements (mouth opening) when listening to a spoken story without visual input (audio-only), and when seeing a silent video of a speaker articulating another story (video-only). In video-only, auditory cortical activity entrained to the absent auditory signal at frequencies below 1 Hz more than to the seen lip movements. This entrainment process was characterized by an auditory-speech-to-brain delay of ∼70 ms in the left hemisphere, compared with ∼20 ms in audio-only. Entrainment to mouth opening was found in the right angular gyrus at frequencies below 1 Hz, and in early visual cortices at 1-8 Hz. These findings demonstrate that the brain can use a silent lip-read signal to synthesize a coarse-grained auditory speech representation in early auditory cortices. Our data indicate the following underlying oscillatory mechanism: seeing lip movements first modulates neuronal activity in early visual cortices at frequencies that match articulatory lip movements; the right angular gyrus then extracts slower features of lip movements, mapping them onto the corresponding speech sound features; this information is fed to auditory cortices, most likely facilitating speech parsing.
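
As a rough illustration of the entrainment measure summarized above (not the authors' actual analysis pipeline), the short Python sketch below estimates magnitude-squared coherence between a simulated cortical channel and a speech envelope at frequencies below 1 Hz. The sampling rate, signal names, lag, and noise levels are all assumptions chosen only to make the example run.

# Illustrative sketch, assuming NumPy/SciPy and simulated signals; the
# abstract reports entrainment below 1 Hz with a ~70 ms speech-to-brain lag.
import numpy as np
from scipy.signal import hilbert, coherence

fs = 200                          # assumed sampling rate in Hz
t = np.arange(0, 300, 1 / fs)     # 5 minutes of simulated data

# Simulated speech: a noise carrier modulated by a slow (<1 Hz) envelope.
slow_mod = 1 + 0.5 * np.sin(2 * np.pi * 0.5 * t)
speech = slow_mod * np.random.randn(t.size)
envelope = np.abs(hilbert(speech))          # auditory speech envelope

# Simulated cortical channel that weakly tracks the envelope with a ~70 ms
# lag (the delay reported for the video-only condition).
lag = int(0.070 * fs)
brain = 0.3 * np.roll(envelope, lag) + np.random.randn(t.size)

# Magnitude-squared coherence; a peak below 1 Hz would reflect the kind of
# low-frequency entrainment described in the abstract.
f, cxy = coherence(brain, envelope, fs=fs, nperseg=int(20 * fs))
print(f"mean coherence below 1 Hz: {cxy[f < 1.0].mean():.3f}")

In the study itself, such coupling was assessed on source-reconstructed MEG signals and compared between conditions; the snippet only shows the generic coherence computation on surrogate data.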

Details

Database :
OAIster
Journal :
Journal of Neuroscience vol.40 (2020) nr.5 p.1053-1065 [ISSN 0270-6474]
Notes :
DOI: 10.1523/JNEUROSCI.1101-19.2019, Journal of Neuroscience vol.40 (2020) nr.5 p.1053-1065 [ISSN 0270-6474], English
Publication Type :
Electronic Resource
Accession number :
edsoai.on1131098184
Document Type :
Electronic Resource