
Calibrating Trust in Autonomous Systems in a Dynamic Environment

Abstract

Appropriately calibrating trust in autonomous systems is essential for successful collaboration between humans and the systems. Over-trust and under-trust often happen in dynamically changing environments, and they can be major causes of serious issues with safety and efficiency. Many studies have examined the role of continuous system transparency in maintaining proper trust calibration; however, few studies have focused on how to detect poor trust calibration or how to mitigate it. In our proposed method of trust calibration, a behavior-based approach is used to detect improper trust calibration, and cognitive cues called "trust calibration cues" are presented to users as triggers for trust calibration. We conducted an online experiment with a drone simulator. Seventy participants performed pothole inspection tasks manually or relied on the drone's automatic inspection. The results demonstrated that adaptively presenting a simple cue could significantly promote trust calibration in both over-trust and under-trust cases.
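The behavior-based detection described in the abstract can be illustrated with a minimal sketch. Everything below is a hypothetical simplification, not the paper's implementation: the names (TaskOutcome, detect_miscalibration, maybe_present_cue), the binary capability signal, and the cue-presentation step are all assumptions. The sketch only shows the core idea of flagging a mismatch between the user's reliance behavior and the system's actual capability, and presenting a cue when a mismatch is detected.

```python
# Minimal sketch of behavior-based trust-calibration detection.
# All names and the decision logic are hypothetical, assumed for
# illustration only; they do not come from the paper.

from dataclasses import dataclass
from typing import Optional


@dataclass
class TaskOutcome:
    relied_on_automation: bool  # user's choice: automatic vs. manual inspection
    automation_capable: bool    # whether automation would succeed on this task


def detect_miscalibration(outcome: TaskOutcome) -> Optional[str]:
    """Behavior-based check: flag over-trust or under-trust, else None."""
    if outcome.relied_on_automation and not outcome.automation_capable:
        return "over-trust"    # relying on automation when it is unreliable
    if not outcome.relied_on_automation and outcome.automation_capable:
        return "under-trust"   # working manually when automation would do fine
    return None


def maybe_present_cue(outcome: TaskOutcome) -> None:
    """Present a simple 'trust calibration cue' when miscalibration is detected."""
    status = detect_miscalibration(outcome)
    if status is not None:
        # The cue is a trigger for the user to re-evaluate their reliance
        # strategy, not a directive to switch modes.
        print(f"Trust calibration cue: possible {status} detected.")


# Example: the user relies on automatic inspection while the system is in a
# state where automation is likely to fail, so an over-trust cue is shown.
maybe_present_cue(TaskOutcome(relied_on_automation=True, automation_capable=False))
```

Note that the cue here is deliberately simple, mirroring the abstract's finding that adaptively presenting even a simple cue promoted trust calibration in both over-trust and under-trust cases.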
