Abstract
When we comprehend language, we often do this in rich settings where we can use many cues to understand what someone is saying. However, it has traditionally been difficult to design experiments with rich three-dimensional contexts that resemble our everyday environments, while maintaining control over the linguistic and nonlinguistic information that is available. Here we test the validity of combining electroencephalography (EEG) and virtual reality (VR) to overcome this problem. We recorded electrophysiological brain activity during language processing in a well-controlled three-dimensional virtual audiovisual environment. Participants were immersed in a virtual restaurant while wearing EEG equipment. In the restaurant, participants encountered virtual restaurant guests. Each guest was seated at a separate table with an object on it (e.g., a plate with salmon). The restaurant guest would then produce a sentence (e.g., "I just ordered this salmon."). The noun in the spoken sentence could either match ("salmon") or mismatch ("pasta") the object on the table, creating a situation in which the auditory information was either appropriate or inappropriate in the visual context. We observed a reliable N400 effect as a consequence of the mismatch. This finding validates the combined use of VR and EEG as a tool to study the neurophysiological mechanisms of everyday language comprehension in rich, ecologically valid settings.
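The N400 effect reported here is a difference in event-related potential (ERP) amplitude between mismatching and matching nouns, typically quantified as the mean voltage difference in a 300-500 ms window after word onset. The sketch below illustrates that computation on simulated single-channel epochs; the sampling rate, trial counts, amplitudes, and noise level are illustrative assumptions, not parameters from the study.

```python
import numpy as np

rng = np.random.default_rng(0)
sfreq = 250  # assumed sampling rate in Hz (illustrative, not from the paper)
times = np.arange(-0.2, 0.8, 1 / sfreq)  # epoch from -200 ms to 800 ms around noun onset

def simulate_epochs(n_trials, n400_amp):
    """Simulate single-trial EEG epochs (trials x samples) for one channel.

    An N400-like negativity is modeled as a Gaussian deflection
    peaking ~400 ms after noun onset, buried in Gaussian noise.
    """
    component = n400_amp * np.exp(-((times - 0.4) ** 2) / (2 * 0.05 ** 2))
    noise = rng.normal(0, 5.0, size=(n_trials, times.size))
    return component + noise

match = simulate_epochs(80, n400_amp=-1.0)     # small N400 to matching nouns
mismatch = simulate_epochs(80, n400_amp=-6.0)  # larger N400 to mismatching nouns

# Average across trials to obtain the ERP per condition,
# then form the mismatch-minus-match difference wave.
erp_match = match.mean(axis=0)
erp_mismatch = mismatch.mean(axis=0)
difference = erp_mismatch - erp_match

# Mean amplitude of the difference wave in the classic N400 window (300-500 ms).
window = (times >= 0.3) & (times <= 0.5)
n400_effect = difference[window].mean()
print(f"N400 effect (mean amplitude, 300-500 ms): {n400_effect:.2f} µV")
```

A negative value here corresponds to the larger negativity for mismatching nouns; in a real analysis the same window average would be computed per participant and submitted to a statistical test.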
| Original language | English |
| --- | --- |
| Pages (from-to) | 862-869 |
| Number of pages | 8 |
| Journal | Behavior Research Methods |
| Volume | 50 |
| Issue number | 2 |
| DOIs | |
| Publication status | Published - Apr 2018 |
| Externally published | Yes |
Keywords
- Language comprehension
- Language processing
- EEG
- Virtual reality
- N400
- Semantic integration
- World events
- Speech
- Comprehension
- ERP
- Model