Slow response times undermine trust in algorithmic (but not human) predictions

Emir Efendić*, P.P.F.M. van de Calseyde, Anthony Evans

*Corresponding author for this work

Research output: Contribution to journal › Article › Scientific › peer-review

44 Citations (Scopus)
258 Downloads (Pure)

Abstract

Algorithms consistently perform well on various prediction tasks, but people often mistrust their advice. Here, we demonstrate one component that affects people's trust in algorithmic predictions: response time. In seven studies (total N = 1,928 with 14,184 observations), we find that people judge slowly generated predictions from algorithms as less accurate and are less willing to rely on them. This effect reverses for human predictions, where slowly generated predictions are judged to be more accurate. In explaining this asymmetry, we find that slower response times signal the exertion of effort for both humans and algorithms. However, the relationship between perceived effort and prediction quality differs for humans and algorithms. For humans, prediction tasks are seen as difficult, and observing effort is therefore positively correlated with the perceived quality of predictions. For algorithms, however, prediction tasks are seen as easy, and effort is therefore uncorrelated with the quality of algorithmic predictions. These results underscore the complex processes and dynamics underlying people's trust in algorithmic (and human) predictions and the cues that people use to evaluate their quality.
Original language: English
Pages (from-to): 103-114
Journal: Organizational Behavior and Human Decision Processes
Volume: 157
Publication status: Published - 2020

Keywords

  • Algorithm aversion
  • Decision time
  • Human-computer interaction
  • Judgment
  • Judgment and decision making
  • People
  • Prediction
  • Response time

