Turning a blind eye to the lexicon: ERPs show no cross-talk between lip-read and lexical context during speech sound processing

Martijn Baart, Arthur Samuel

Research output: Contribution to journal › Article › Scientific › peer-review

23 Citations (Scopus)

Abstract

Electrophysiological research has shown that pseudowords elicit more negative Event-Related Potentials (ERPs) than words within 250 ms after the lexical status of a speech token is defined (e.g., after hearing the onset of “ga” in the Spanish word “lechuga” versus “da” in the pseudoword “lechuda”). Since lip-read context also affects speech sound processing within this time frame, we investigated whether these two context effects on speech perception operate together. We measured ERPs while listeners were presented with auditory-only, audiovisual, or lip-read-only stimuli, in which the critical syllable that determined lexical status was naturally timed (Experiment 1) or delayed by ∼800 ms (Experiment 2). We replicated the electrophysiological effect of stimulus lexicality, and also observed substantial effects of audiovisual speech integration for words and pseudowords. Critically, we found several early time-windows (< 400 ms) in which both contexts influenced auditory processes, but we never observed any cross-talk between the two types of speech context. This absence of any interaction supports the view that lip-read and lexical context mainly function separately, and may have different neural bases and purposes.
Original language: English
Pages (from-to): 42-59
Journal: Journal of Memory and Language
Volume: 85
Publication status: Published - 2015
Externally published: Yes
