Designing an AI-Companion to Support the Driver in Highly Autonomous Cars

Published in HCII 2020: Human-Computer Interaction. Multimodal and Natural Interaction, 2020

Recommended citation: de Salis E. et al. (2020) Designing an AI-Companion to Support the Driver in Highly Autonomous Cars. In: Kurosu M. (eds) Human-Computer Interaction. Multimodal and Natural Interaction. HCII 2020. Lecture Notes in Computer Science, vol 12182. Springer, Cham.

Abstract:

In this paper, we propose a model of an AI-Companion for conditionally automated cars, able both to maintain the driver's awareness of the environment and to design take-over requests (TORs) on the fly, with the goal of better supporting the driver in case of a disengagement. Our AI-Companion would interact with the driver in two ways. First, it could provide feedback to the driver in order to raise their Situation Awareness (SA), prevent them from dropping out of the supervision loop and thus improve takeover during critical situations by decreasing their cognitive workload. Second, in the case of a TOR, it could make a smart choice of modalities for conveying the request to the driver. In particular, the AI-Companion can interact with the driver through many modalities, such as visual messages (warning lights, images, text, etc.), auditory signals (sound, speech, etc.) and haptic technologies (vibrations in different parts of the seat: back, headrest, etc.). The ultimate goal of the proposed approach is to design smart HMIs in semi-autonomous vehicles that are able to 1) understand the user's state and fitness to drive, 2) understand the current external situation (vehicle status and behavior) in order to minimize automation surprise and maximize safety and trust, and 3) leverage AI to provide adaptive TORs and useful feedback to the driver.

Download paper here