
The structure and statistics of language jointly shape cross-frequency neural dynamics during spoken language comprehension.

Authors :
Weissbart H
Martin AE
Source :
Nature communications [Nat Commun] 2024 Oct 14; Vol. 15 (1), pp. 8850. Date of Electronic Publication: 2024 Oct 14.
Publication Year :
2024

Abstract

Humans excel at extracting structurally-determined meaning from speech despite inherent physical variability. This study explores the brain's ability to predict and understand spoken language robustly. It investigates the relationship between structural and statistical language knowledge in brain dynamics, focusing on phase and amplitude modulation. Using syntactic features from constituent hierarchies and surface statistics from a transformer model as predictors of forward encoding models, we reconstructed cross-frequency neural dynamics from MEG data during audiobook listening. Our findings challenge a strict separation of linguistic structure and statistics in the brain, with both aiding neural signal reconstruction. Syntactic features have a more temporally spread impact, and both word entropy and the number of closing syntactic constituents are linked to the phase-amplitude coupling of neural dynamics, implying a role in temporal prediction and cortical oscillation alignment during speech processing. Our results indicate that structured and statistical information jointly shape neural dynamics during spoken language comprehension and suggest an integration process via a cross-frequency coupling mechanism.
(© 2024. The Author(s).)
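
For readers unfamiliar with the general class of methods the abstract names, the snippet below is a minimal, self-contained sketch of a forward encoding model: time-lagged ridge regression mapping word-level predictors onto a neural signal. It uses synthetic data and NumPy only; the function names (lagged_design, ridge_fit), lag range, and regularization value are illustrative assumptions, not the authors' pipeline or parameters.

```python
import numpy as np

def lagged_design(features, lags):
    """Build a time-lagged design matrix: one column per (feature, lag) pair."""
    n_samples, n_features = features.shape
    X = np.zeros((n_samples, n_features * len(lags)))
    for j, lag in enumerate(lags):
        shifted = np.roll(features, lag, axis=0)
        if lag > 0:
            shifted[:lag] = 0.0      # zero out rows wrapped around by np.roll
        elif lag < 0:
            shifted[lag:] = 0.0
        X[:, j * n_features:(j + 1) * n_features] = shifted
    return X

def ridge_fit(X, y, alpha=1.0):
    """Closed-form ridge regression: w = (X'X + alpha*I)^-1 X'y."""
    n_cols = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(n_cols), X.T @ y)

# Synthetic example: two word-level predictors (e.g. a surprisal-like and a
# constituent-count-like regressor) driving a noisy "neural" signal.
rng = np.random.default_rng(0)
n_samples = 5000
features = rng.standard_normal((n_samples, 2))
true_kernel = np.array([0.8, -0.5])          # instantaneous weights, for simplicity
signal = features @ true_kernel + 0.5 * rng.standard_normal(n_samples)

lags = list(range(0, 10))                     # 0..9 sample delays (stimulus precedes response)
X = lagged_design(features, lags)
w = ridge_fit(X, signal, alpha=10.0)
pred = X @ w

r = np.corrcoef(pred, signal)[0, 1]
print(f"reconstruction correlation: {r:.3f}")
```

In practice, the reconstruction correlation between predicted and observed responses (here printed for synthetic data) is the kind of metric used to compare predictor sets, e.g. structural versus statistical features.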

Details

Language :
English
ISSN :
2041-1723
Volume :
15
Issue :
1
Database :
MEDLINE
Journal :
Nature communications
Publication Type :
Academic Journal
Accession number :
39397036
Full Text :
https://doi.org/10.1038/s41467-024-53128-1