Abstract
Human interlocutors automatically adapt verbal and non-verbal signals so that different behaviors become synchronized over time. Multimodal communication comes naturally to humans, but not to Embodied Conversational Agents (ECAs). Knowing which behavioral channels synchronize within and across speakers, and how they align, seems critical to the development of ECAs. Yet there is little data-driven research that provides guidelines for the synchronization of different channels within an interlocutor. This study focuses on intrapersonal dependencies of multimodal behavior by applying cross-recurrence analysis to a multimodal communication dataset to better understand the temporal relationships between language and gestural behavior channels. By shedding light on the intrapersonal synchronization of communicative channels in humans, we provide an initial manual for modality synchronization in ECAs.

CCS CONCEPTS
· Human-centered computing → Empirical studies in HCI; · Computing methodologies → Discourse, dialogue and pragmatics.
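The method named in the abstract is cross-recurrence analysis between two of a speaker's own behavior channels. As an illustration only, the sketch below computes a categorical cross-recurrence matrix and its diagonal recurrence profile for two hypothetical per-frame annotation streams (speech and gesture activity); the function names, toy data, and binary coding are assumptions made for this example, not the paper's actual pipeline or dataset.

```python
import numpy as np

def cross_recurrence_matrix(x, y):
    """Binary cross-recurrence matrix for two categorical behavior streams.

    x, y: 1-D arrays of per-frame behavior labels (e.g. gesture vs. speech
    activity codes). Entry (i, j) is 1 when the state of channel x at frame i
    matches the state of channel y at frame j.
    """
    x = np.asarray(x)
    y = np.asarray(y)
    return (x[:, None] == y[None, :]).astype(int)

def diagonal_recurrence_profile(crm, max_lag):
    """Recurrence rate along the diagonals of the cross-recurrence matrix.

    A peak at a positive lag k suggests that channel y tends to match
    channel x about k frames later; a peak at a negative lag suggests
    channel y leads channel x.
    """
    lags = range(-max_lag, max_lag + 1)
    return {lag: np.diagonal(crm, offset=lag).mean() for lag in lags}

# Hypothetical per-frame annotations: 1 = channel active, 0 = inactive.
speech  = np.array([0, 1, 1, 1, 0, 0, 1, 1, 0, 0])
gesture = np.array([0, 0, 1, 1, 1, 0, 0, 1, 1, 0])

crm = cross_recurrence_matrix(speech, gesture)
profile = diagonal_recurrence_profile(crm, max_lag=3)
print(profile)  # the lag with the highest recurrence rate hints at the typical offset
```

In a real analysis, the recurrence profile would be computed over annotated dialogue recordings and compared against a shuffled baseline; the toy arrays above only demonstrate the mechanics of the matrix and the lag profile.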
Original language | English |
---|---|
Title of host publication | Intrapersonal dependencies in multimodal behavior |
Publisher | Association for Computing Machinery |
Pages | 1-8 |
Number of pages | 8 |
ISBN (Print) | 9781450375863 |
DOIs | |
Publication status | Published - 20 Oct 2020 |
Event | IVA '20: ACM International Conference on Intelligent Virtual Agents - University of Glasgow, Glasgow, United Kingdom<br>Duration: 19 Oct 2020 → 23 Jan 2021<br>Conference number: 20<br>https://dl.acm.org/conference/iva |
Conference
Conference | IVA '20 |
---|---|
Abbreviated title | IVA '20 |
Country/Territory | United Kingdom |
City | Glasgow |
Period | 19/10/20 → 23/01/21 |
Internet address | https://dl.acm.org/conference/iva |