Frontiers in Behavioral Economics (Mar 2024)

Humans in XAI: increased reliance in decision-making under uncertainty by using explanation strategies

  • Olesja Lammert,
  • Birte Richter,
  • Christian Schütze,
  • Kirsten Thommes,
  • Britta Wrede

DOI
https://doi.org/10.3389/frbhe.2024.1377075
Journal volume & issue
Vol. 3

Abstract

Introduction

Although decision support systems (DSS) that rely on artificial intelligence (AI) increasingly provide explanations to computer and data scientists about opaque features of the decision process, especially when it involves uncertainty, there is still only limited attention to making the process transparent to end users.

Methods

This paper compares four distinct explanation strategies employed by a DSS, represented by the social agent Floka, designed to assist end users in making decisions under uncertainty. Using an economic experiment with 742 participants who make lottery choices according to the Holt and Laury paradigm, we contrast two explanation strategies offering accurate information (transparent vs. guided) with two strategies prioritizing human-centered explanations (emotional vs. authoritarian) and a baseline (no explanation).

Results and discussion

Our findings indicate that a guided explanation strategy results in higher user reliance than a transparent strategy. Furthermore, our results suggest that user reliance is contingent on the chosen explanation strategy and that, in some instances, the absence of an explanation can also lead to increased user reliance.
