Degree of Language Experience Modulates Visual Attention to Visible Speech and Iconic Gestures During Clear and Degraded Speech Comprehension

Linda Drijvers*, Julija Vaitonyte, Asli Ozyurek

*Corresponding author for this work

Research output: Contribution to journal › Article › Scientific › peer-review

Abstract

Visual information conveyed by iconic hand gestures and visible speech can enhance speech comprehension under adverse listening conditions for both native and non-native listeners. However, how a listener allocates visual attention to these articulators during speech comprehension is unknown. We used eye-tracking to investigate whether and how native and highly proficient non-native listeners of Dutch allocated overt eye gaze to visible speech and gestures during clear and degraded speech comprehension. Participants watched video clips of an actress uttering a clear or degraded (six-band noise-vocoded) action verb, with or without an accompanying gesture, and indicated the word they heard in a cued-recall task. Gestural enhancement (i.e., a relative reduction in reaction-time cost) was largest when speech was degraded for all listeners, but it was stronger for native listeners. Both groups mostly gazed at the face during comprehension, but non-native listeners gazed at gestures more often than native listeners did. However, gaze allocation to gestures predicted gestural benefit during degraded speech comprehension for native but not for non-native listeners. We conclude that non-native listeners may gaze at gestures more because it is more challenging for them to resolve the degraded auditory cues and couple those cues to the phonological information conveyed by visible speech. This diminished phonological knowledge may in turn hinder non-native listeners' use of the semantic information conveyed by gestures. Our results demonstrate that the degree of language experience modulates overt visual attention to visual articulators, resulting in different visual benefits for native versus non-native listeners.

Original language: English
Article number: 12789
Number of pages: 25
Journal: Cognitive Science
Volume: 43
Issue number: 10
DOIs
Publication status: Published - Oct 2019
Externally published: Yes

Keywords

  • Speech comprehension
  • Gesture
  • Semantic integration
  • Degraded speech
  • Non-native
  • Eye-tracking
  • Visual attention
  • Multimodal
  • Sign language
  • Non-native listeners
  • Background noise
  • Native language
  • Perception
  • Second language
  • Recognition
  • Information
  • Acquisition
  • Brain
