The experimental paradigm of embodiment illusions has greatly contributed to our knowledge of how the brain distinguishes self from other. Different types of illusions have provided empirical data for a theoretical framework describing multisensory integration of signals from both within and outside the body, and predictive coding mechanisms that weigh those signals. However, most embodiment illusion studies lean heavily on visual sensory information as the main signal for establishing the illusion. Few studies to date have explored non-visual embodiment illusions, which could lead to a more thorough understanding of the underlying mechanisms of embodiment. In this study, we approach an auditory embodiment illusion, more specifically a voice illusion, in an in-depth, structured way. We combined vibrotactile feedback on the throat with voice sounds, both articulated and non-articulated. Additionally, we measured interoceptive, proprioceptive and exteroceptive sensitivity. Results indicate that non-visual embodiment illusions are much more difficult to establish than vision-based ones, and that proprioceptive and interoceptive sensitivity might influence illusion strength. Absence of feedback might disrupt the illusion less than asynchronous feedback, which is in line with predictive coding expectations.
|Title of host publication||2019 8th International Conference on Affective Computing and Intelligent Interaction (ACII)|
|Publication status||Published - Sep 2019|
|Event||8th International Conference on Affective Computing and Intelligent Interaction - Cambridge, United Kingdom|
Duration: 3 Sep 2019 → 6 Sep 2019