When a hit sounds like a kiss: An electrophysiological exploration of semantic processing in visual narrative

Mirella Manfredi, Neil Cohn, Marta Kutas

Research output: Contribution to journal › Article › Scientific › peer-review

29 Citations (Scopus)
124 Downloads (Pure)

Abstract

Researchers have long questioned whether information presented through different sensory modalities involves distinct or shared semantic systems. We investigated uni-sensory cross-modal processing by recording event-related brain potentials to words replacing the climactic event in a visual narrative sequence (comics). We compared Onomatopoeic words, which phonetically imitate action sounds (Pow!), with Descriptive words, which describe an action (Punch!), that were (in)congruent within their sequence contexts. Across two experiments, larger N400s appeared to Anomalous Onomatopoeic or Descriptive critical panels than to their congruent counterparts, reflecting a difficulty in semantic access/retrieval. Also, Descriptive words evinced a greater late frontal positivity compared to Onomatopoeic words, suggesting that, though plausible, they may be less predictable/expected in visual narratives. Our results indicate that uni-sensory cross-modal integration of word/letter-symbol strings within visual narratives elicits ERP patterns typically observed for written sentence processing, thereby suggesting the engagement of similar domain-independent integration/interpretation mechanisms.

Original language: English
Pages (from-to): 28-38
Number of pages: 11
Journal: Brain and Language
Volume: 169
Publication status: Published - 24 Feb 2017

Keywords

  • onomatopoeia
  • visual language
  • visual narrative
  • event-related potentials

