IEEE Access (Jan 2024)

POTDAI: A Tool to Evaluate the Perceived Operational Trust Degree in Artificial Intelligence Systems

  • David Martin-Moncunill,
  • Eduardo Garcia Laredo,
  • Juan Carlos Nieves

DOI
https://doi.org/10.1109/ACCESS.2024.3454061
Journal volume & issue
Vol. 12
pp. 133097–133109

Abstract

There is evidence that a user’s subjective confidence in an Artificial Intelligence (AI)-based system is crucial to its use, even more decisive than the system’s objective effectiveness and efficiency. Accordingly, various methods have been proposed for analyzing confidence in AI. In our research, we set out to evaluate how the degree of perceived trust in an AI system could affect a user’s final decision to follow the AI’s recommendations. To this end, we established the criteria that such an evaluation should meet, following a co-creation approach with a multidisciplinary group of 10 experts. After a systematic review of 3,204 articles, we found that none of the existing tools met the inclusion criteria. We therefore introduce the “Perceived Operational Trust Degree in AI” (POTDAI) tool, built on the findings of the expert group and the literature analysis, using a methodology more rigorous than those previously employed to create similar evaluation tools. POTDAI is a short questionnaire of six Likert-type items, inspired by the original version of the Technology Acceptance Model (TAM) and designed for quick and easy application. In this way, we also respond to the need, pointed out by authors such as Vorm and Combs, to extend the TAM to address questions related to user perception in systems with an AI component. POTDAI can thus be used alone or in combination with the TAM to obtain additional information on the evaluated system’s perceived usefulness and ease of use.
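For illustration only: the abstract does not specify the item wording, scale range, or scoring procedure, but a six-item Likert instrument is commonly scored by averaging the item responses. Below is a minimal sketch under those assumptions (a hypothetical 1–5 scale and a simple mean); it is not the authors’ published scoring method.

```python
# Hypothetical sketch: aggregate six Likert-type responses into a single
# perceived-trust score. The scale range (assumed 1-5) and the averaging
# rule are illustrative assumptions, not taken from the POTDAI paper.
from statistics import mean

def potdai_score(responses: list[int], scale_min: int = 1, scale_max: int = 5) -> float:
    """Return the mean of six Likert responses, validated against the scale."""
    if len(responses) != 6:
        raise ValueError("POTDAI is described as a six-item questionnaire")
    if any(r < scale_min or r > scale_max for r in responses):
        raise ValueError(f"responses must lie in [{scale_min}, {scale_max}]")
    return mean(responses)

# Example: one respondent's answers to the six items.
print(potdai_score([4, 5, 3, 4, 4, 5]))  # -> 4.1666...
```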

Keywords