Applied AI Letters (Dec 2021)

Explaining autonomous drones: An XAI journey

  • Mark Stefik
  • Michael Youngblood
  • Peter Pirolli
  • Christian Lebiere
  • Robert Thomson
  • Robert Price
  • Lester D. Nelson
  • Robert Krivacic
  • Jacob Le
  • Konstantinos Mitsopoulos
  • Sterling Somers
  • Joel Schooler

DOI
https://doi.org/10.1002/ail2.54
Journal volume & issue
Vol. 2, no. 4

Abstract

COGLE (COmmon Ground Learning and Explanation) is an explainable artificial intelligence (XAI) system for a setting in which autonomous drones deliver supplies to field units in mountainous areas. Mission risks vary with topography, flight decisions, and mission goals. Each mission engages a human plus AI team in which users determine which of two AI-controlled drones is better suited to the mission. This article reports the technical approach and findings of the project and reflects on the challenges that complex combinatorial problems present for users, machine learning, user studies, and the context of use for XAI systems. COGLE creates explanations in multiple modalities. Narrative “What” explanations compare what each drone does on a mission, and “Why” explanations describe each drone's competencies as determined from counterfactual experiments. Visual “Where” explanations highlight risks on maps to help users interpret flight plans. One branch of the research studied whether the explanations helped users to predict drone performance. In that branch, a model induction user study showed that post-decision explanations had only a small effect in teaching users to determine for themselves which drone is better for a mission. Subsequent reflection suggests that supporting human plus AI decision making with pre-decision explanations is a better context for benefiting from explanations on combinatorial tasks.

Keywords