Imitation of human motion achieves natural head movements for humanoid robots in an active-speaker detection task

Research output: Contribution to conference › Paper › Scientific › peer-review


Abstract

Head movements are crucial in social human-human interaction. They convey important cues (e.g., joint attention, speaker detection) that verbal interaction alone cannot provide. This advantage also holds for human-robot interaction. Although modeling human motion with generative AI models has become an active research area in robotics in recent years, the use of these methods to produce head movements in human-robot interaction remains underexplored. In this work, we employed a generative AI pipeline to produce human-like head movements for a Nao humanoid robot. In addition, we tested the system on a real-time active-speaker tracking task in a group conversation setting. Overall, the results show that the Nao robot successfully imitates human head movements in a natural manner while actively tracking the speakers during the conversation. Code and data from this study are available at https://github.com/dingdingding60/Humanoids2024HRI
Original language: English
Pages: 645-652
Number of pages: 8
DOIs
Publication status: Published - 16 Jul 2024
Event: IEEE-RAS 23rd International Conference on Humanoid Robots, Nancy, France
Duration: 22 Nov 2024 - 24 Nov 2024
Conference number: 23

Conference

Conference: IEEE-RAS 23rd International Conference on Humanoid Robots
Abbreviated title: Humanoids 2024
Country/Territory: France
City: Nancy
Period: 22/11/24 - 24/11/24

Keywords

  • cs.RO
  • cs.AI
  • cs.HC
  • cs.LG
