
Individual differences in explanation strategies for image classification and implications for explainable AI

Licensed under a Creative Commons Attribution 4.0 (CC BY 4.0) license.
Abstract

While saliency-based explainable AI (XAI) methods are well developed for image classification models, they fall short in comparison with human explanations. Here we examined human explanation strategies for image classification and their relationship with explanation quality to inform better XAI designs. We found that individuals differed in attention strategies during explanation: participants adopting more explorative strategies used more visual information in their explanations, whereas those adopting more focused strategies included more conceptual information. In addition, visual explanations were rated as more effective for teaching learners without prior category knowledge, whereas conceptual explanations were more diagnostic for observers with prior knowledge inferring the class label. Thus, individuals differ in their use of visual and conceptual information when explaining image classification; these two types of information facilitate different aspects of explanation quality and suit learners with different levels of experience. These findings have important implications for the adaptive use of visual and conceptual information in XAI development.
