Predictive coding of visual-auditory and motor-auditory events: An electrophysiological study

Research output: Contribution to journal › Article › Scientific › peer-review

Abstract

The amplitude of auditory components of the event-related potential (ERP) is attenuated when sounds are self-generated compared to externally generated sounds. This effect has been ascribed to internal forward models predicting the sensory consequences of one's own motor actions. Auditory potentials are also attenuated when a sound is accompanied by a video of anticipatory visual motion that reliably predicts the sound. Here, we investigated whether the neural underpinnings of prediction of upcoming auditory stimuli are similar for motor-auditory (MA) and visual–auditory (VA) events using a stimulus omission paradigm. In the MA condition, a finger tap triggered the sound of a handclap, whereas in the VA condition the same sound was accompanied by a video showing the handclap. In both conditions, the auditory stimulus was omitted in either 50% or 12% of the trials. These auditory omissions induced early and mid-latency ERP components (oN1 and oN2, presumably reflecting prediction and prediction error), and subsequent higher-order error evaluation processes. The oN1 and oN2 of MA and VA were alike in amplitude, topography, and neural sources, even though the predictions originate in different brain areas (motor versus visual cortex). This suggests that both MA and VA predictions activate a sensory template of the sound in auditory cortex.
Keywords: Predictive coding, Stimulus omission, Visual–auditory, Motor–auditory, Event-related potentials
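
As a concrete illustration of the omission manipulation described in the abstract, the sketch below generates trial lists in which the auditory stimulus is omitted on either 50% or 12% of trials, for the MA and VA conditions. This is a minimal, hypothetical Python example, not the authors' stimulus-presentation code; the function `make_trials` and all field names are assumptions.

```python
import random

def make_trials(condition, n_trials, p_omit, seed=0):
    """Hypothetical sketch: build a trial list for one block.

    condition : 'MA' (finger tap triggers a handclap sound) or
                'VA' (video of a handclap accompanies the sound).
    p_omit    : proportion of trials on which the sound is omitted
                (0.50 or 0.12 in the study described above).
    """
    rng = random.Random(seed)
    n_omit = round(n_trials * p_omit)  # fixed number of omission trials
    flags = [True] * n_omit + [False] * (n_trials - n_omit)
    rng.shuffle(flags)  # randomize omission positions within the block
    return [{"condition": condition, "sound_omitted": f} for f in flags]

# Example: a 50%-omission MA block and a 12%-omission VA block.
ma_block = make_trials("MA", n_trials=100, p_omit=0.50)
va_block = make_trials("VA", n_trials=100, p_omit=0.12)
print(sum(t["sound_omitted"] for t in ma_block))  # -> 50
print(sum(t["sound_omitted"] for t in va_block))  # -> 12
```

Fixing the number of omissions per block (rather than sampling each trial independently) guarantees the stated omission rate in every block; whether the original study constrained the sequence this way is not specified in the abstract.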
Original language: English
Pages (from-to): 88–96
Journal: Brain Research
Volume: 1626
DOI: 10.1016/j.brainres.2015.01.036
ISSN: 0006-8993
Publisher: Elsevier Science BV
Publication status: Published - 2015

Cite this


Predictive coding of visual-auditory and motor-auditory events: An electrophysiological study. / Stekelenburg, J.J.; Vroomen, J.

In: Brain Research, Vol. 1626, 2015, p. 88–96.
