
Predictive coding of visual-auditory and motor-auditory events: An electrophysiological study.

Authors :
Stekelenburg JJ
Vroomen J
Source :
Brain research [Brain Res] 2015 Nov 11; Vol. 1626, pp. 88-96. Date of Electronic Publication: 2015 Jan 30.
Publication Year :
2015

Abstract

The amplitude of auditory components of the event-related potential (ERP) is attenuated when sounds are self-generated compared to externally generated sounds. This effect has been ascribed to internal forward models predicting the sensory consequences of one's own motor actions. Auditory potentials are also attenuated when a sound is accompanied by a video of anticipatory visual motion that reliably predicts the sound. Here, we investigated whether the neural underpinnings of prediction of upcoming auditory stimuli are similar for motor-auditory (MA) and visual-auditory (VA) events using a stimulus omission paradigm. In the MA condition, a finger tap triggered the sound of a handclap, whereas in the VA condition the same sound was accompanied by a video showing the handclap. In both conditions, the auditory stimulus was omitted in either 50% or 12% of the trials. These auditory omissions induced early and mid-latency ERP components (oN1 and oN2, presumably reflecting prediction and prediction error) and subsequent higher-order error evaluation processes. The oN1 and oN2 of MA and VA were alike in amplitude, topography, and neural sources, even though the predictions originate in different brain areas (motor versus visual cortex). This suggests that MA and VA predictions activate a sensory template of the sound in auditory cortex. This article is part of a Special Issue entitled SI: Prediction and Attention. (Copyright © 2015 Elsevier B.V. All rights reserved.)
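As a minimal sketch of how omission-evoked ERPs such as the oN1 and oN2 described above can be obtained from continuous EEG, the Python/NumPy example below cuts epochs around the moments where the expected sound was omitted, baseline-corrects them, and averages across trials. All inputs (sampling rate, data array, trigger indices) are illustrative placeholders, not the authors' actual recording parameters or analysis pipeline.

import numpy as np

# Assumed inputs (illustrative only):
#   eeg          - continuous EEG, shape (n_channels, n_samples)
#   omission_idx - sample indices at which the expected sound was omitted
#   fs           - sampling rate in Hz
fs = 512
rng = np.random.default_rng(0)
eeg = rng.standard_normal((64, 60 * fs))        # placeholder data
omission_idx = np.arange(2 * fs, 58 * fs, fs)   # placeholder trigger positions

pre, post = int(0.2 * fs), int(0.5 * fs)        # -200 ms to +500 ms epoch window

# Cut epochs around each omission trigger
epochs = np.stack([eeg[:, i - pre:i + post] for i in omission_idx])

# Baseline-correct each epoch using the 200 ms before the (omitted) sound onset
baseline = epochs[:, :, :pre].mean(axis=2, keepdims=True)
epochs -= baseline

# Average across trials to obtain the omission ERP per channel;
# early and mid-latency negative deflections in this average would
# correspond to components like the oN1 and oN2
omission_erp = epochs.mean(axis=0)              # shape (n_channels, pre + post)
time_ms = (np.arange(-pre, post) / fs) * 1000

Per-condition averages computed this way (e.g., 50% versus 12% omission, or MA versus VA) would then be compared in amplitude and topography.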

Details

Language :
English
ISSN :
1872-6240
Volume :
1626
Database :
MEDLINE
Journal :
Brain research
Publication Type :
Academic Journal
Accession number :
25641042
Full Text :
https://doi.org/10.1016/j.brainres.2015.01.036