The ghost in the machine: Emotionally intelligent conversational agents and the failure to regulate ‘deception by design’

Pauline Kuss*, Ronald Leenes

*Corresponding author for this work

Research output: Contribution to journal › Article › Scientific › peer-review

Abstract

Google’s Duplex illustrates the great strides made in AI to provide synthetic agents with the capability for intuitive and seemingly natural human–machine interaction, fostering a growing acceptance of AI systems as social actors. Following BJ Fogg’s captology framework, we analyse the persuasive and potentially manipulative power of emotionally intelligent conversational agents (EICAs). By definition, human-sounding conversational agents are ‘designed to deceive’. They do so on the basis of vast amounts of information about the individual they are interacting with. We argue that although the current data protection and privacy framework in the EU offers some protection against manipulative conversational agents, the real upcoming issues are not yet acknowledged in regulation.
Original language: English
Pages (from-to): 320-358
Number of pages: 39
Journal: SCRIPTed
Volume: 17
Issue number: 2
DOIs
Publication status: Published - 6 Aug 2020

Keywords

  • Google Duplex
  • conversational agent
  • persuasion
  • manipulation
  • regulatory failure

