Semantically related gestures move alike: Towards a distributional semantics of gesture kinematics

Wim Pouw, Jan de Wit, Sara Bögels, Marlou Rasenberg, Branka Milivojevic, Asli

Research output: Contribution to conference › Paper › Scientific › peer-review

Abstract

Most manual communicative gestures that humans produce cannot be looked up in a dictionary, as these manual gestures inherit their meaning in large part from the communicative context and are not conventionalized. However, it is understudied to what extent the communicative signal as such (bodily postures in movement, or kinematics) can inform us about gesture semantics. Can we construct, in principle, a distribution-based semantics of gesture kinematics, similar to how word vectorization methods in NLP (Natural Language Processing) are now widely used to study semantic properties of text and speech? For such a project to get off the ground, we need to know the extent to which semantically similar gestures are more likely to be kinematically similar. In Study 1 we assess whether semantic (word2vec) distances between the concepts that participants were explicitly instructed to convey in silent gestures relate to the kinematic distances between those gestures, as obtained with Dynamic Time Warping (DTW). In a second, dyadic director-matcher study we assess kinematic similarity between the spontaneous co-speech gestures of interacting participants. Participants were asked before and after the interaction how they would name the objects; the semantic distances between the resulting names were then related to the kinematic distances between the gestures produced while conveying those objects during the interaction. We find that the gestures’ semantic relatedness is reliably predictive of kinematic relatedness across these two highly divergent studies, which suggests that developing an NLP method for deriving semantic relatedness from kinematics is a promising avenue for automated multimodal recognition. Deeper implications for statistical learning processes in multimodal language are discussed.
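The abstract describes a two-step analysis: semantic distances between concept labels are computed from word vectors, kinematic distances between gesture time series are computed with Dynamic Time Warping, and the two distance structures are then related. The sketch below illustrates that pipeline in Python; it is not the authors' code. The pretrained vector model, the random placeholder trajectories, and the rank-correlation test are illustrative assumptions, and the studies' actual statistical modelling is more elaborate than the correlation shown here.

```python
# Minimal sketch (not the authors' code): relate semantic distances between
# concept labels (word vectors) to kinematic distances between gesture
# trajectories (Dynamic Time Warping). Model name and data are placeholders.
import numpy as np
from scipy.stats import spearmanr
import gensim.downloader as api

# Hypothetical data: one keypoint trajectory (T x D array) per gesture,
# each keyed by the concept label it was meant to convey.
gestures = {
    "hammer": np.random.rand(40, 6),
    "saw": np.random.rand(55, 6),
    "scissors": np.random.rand(48, 6),
}

wv = api.load("glove-wiki-gigaword-50")  # any pretrained word-vector model

def semantic_distance(a, b):
    """Cosine distance between the labels' word vectors."""
    return 1.0 - wv.similarity(a, b)

def dtw_distance(x, y):
    """Plain DTW distance between two (T x D) keypoint trajectories."""
    n, m = len(x), len(y)
    acc = np.full((n + 1, m + 1), np.inf)
    acc[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(x[i - 1] - y[j - 1])
            acc[i, j] = cost + min(acc[i - 1, j], acc[i, j - 1], acc[i - 1, j - 1])
    return acc[n, m]

# Pairwise distances over all gesture pairs, then a rank correlation to ask
# whether semantically closer concepts go with kinematically closer gestures.
labels = list(gestures)
sem, kin = [], []
for i in range(len(labels)):
    for j in range(i + 1, len(labels)):
        sem.append(semantic_distance(labels[i], labels[j]))
        kin.append(dtw_distance(gestures[labels[i]], gestures[labels[j]]))

rho, p = spearmanr(sem, kin)
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")
```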
Original language: English
Pages: 269–287
Number of pages: 19
DOIs
Publication status: Published - 27 Jan 2021
Event: Digital Human Modeling and Applications in Health, Safety, Ergonomics and Risk Management: Human Body, Motion and Behavior
Duration: 24 Jul 2021 – 29 Jul 2021
Conference number: 2021

Conference

Conference: Digital Human Modeling and Applications in Health, Safety, Ergonomics and Risk Management
Abbreviated title: HCII
Period: 24/07/21 – 29/07/21

Keywords

  • Manual gesture kinematics
  • NLP
  • Speech
  • Semantics
  • Time series comparison

