The ability to read emotional expressions from the human face and voice is an important skill in our day-to-day interactions with others. How this ability develops may be influenced by atypical experiences early in life. Here, we used event-related potentials (ERPs) to investigate multimodal processing of fearful and happy face/voice pairs in 9-month-old infants prenatally exposed to maternal anxiety. Infants were presented with emotional vocalisations (happy/fearful) preceded by emotional facial expressions (happy/fearful). Infants exposed to higher levels of prenatal maternal anxiety showed larger P350 amplitudes in response to fearful vocalisations, regardless of the type of visual prime, which may indicate increased attention to fearful vocalisations. A trend towards a positive association between P150 amplitudes and maternal anxiety scores during pregnancy may further suggest that these infants are more easily aroused by fearful vocalisations and extract their features more thoroughly. These findings are compatible with the hypothesis that prenatal exposure to maternal anxiety is related to more extensive processing of fear-related stimuli.
- Developmental origins of health and disease (DOHaD)
- Event-related potential (ERP)
- Multimodal processing
- Emotion perception