Multimodal processing of emotional information in 9-month-old infants II: Prenatal exposure to maternal anxiety

R.A. Otte, F.C.L. Donkers, M.A.K.A. Braeken, B.R.H. Van den Bergh

Research output: Contribution to journal › Article › Scientific › peer-review

25 Citations (Scopus)

Abstract

The ability to read emotional expressions from the human face and voice is an important skill in our day-to-day interactions with others. How this ability develops may be influenced by atypical experiences early in life. Here, we investigated multimodal processing of fearful and happy face/voice pairs in 9-month-olds prenatally exposed to maternal anxiety, using event-related potentials (ERPs). Infants were presented with emotional vocalisations (happy/fearful) preceded by emotional facial expressions (happy/fearful). The results revealed larger P350 amplitudes in response to fearful vocalisations when infants had been exposed to higher levels of anxiety, regardless of the type of visual prime, which may indicate increased attention to fearful vocalisations. A trend towards a positive association between P150 amplitudes and maternal anxiety scores during pregnancy may suggest that these infants are also more easily aroused by fearful vocalisations and extract their features more thoroughly. These findings are compatible with the hypothesis that prenatal exposure to maternal anxiety is related to more extensive processing of fear-related stimuli.
Original language: English
Pages (from-to): 107-117
Journal: Brain and Cognition
Volume: 95
DOIs
Publication status: Published - 2015

Keywords

  • Developmental origins of health and disease (DOHaD)
  • Anxiety
  • Infant
  • Event-related potential (ERP)
  • Multimodal processing
  • Emotion perception
