How Do Different Feedback Modalities Affect Drivers' Attention and Task Performance When Interacting with In-Vehicle Infotainment Systems
Interacting with in-vehicle infotainment systems (IVIS) while driving influences how drivers allocate attention to the road, which can affect task and driving proficiency. Haptic feedback in IVIS has emerged as a promising avenue to support drivers' interactions. However, little is known about the impact of diverse feedback modalities on the different types of interactions commonly carried out with IVIS. Two studies were conducted using production vehicles to understand the effects of IVIS featuring haptic feedback modalities (vibration and force touch) compared to other modalities (audio-only and no haptic feedback) for different types of interactions. Findings indicate that vibration and audio-only feedback supported drivers' attention to the road and increased task performance in most types of interactions. Conversely, force touch did not support the driver and yielded the same results as having no haptic feedback. These results can support the design and enhancement of IVIS to better support drivers.
2024 · Pablo Puente Guillen et al. · In-Vehicle Haptic, Audio & Multimodal Feedback · Vibrotactile Feedback & Skin Stimulation · Force Feedback & Pseudo-Haptic Weight · AutoUI
Steering Away from Automotive HMI De Facto Standards – The Effect on Control Discoverability
Automotive manufacturers are continually looking for a competitive edge, which may include implementing novel user interface solutions. This research aimed to understand how a novel location and control type impact the discoverability of vehicle functions. Participants were asked to complete visual search tasks in two virtual vehicle interiors (wearing a virtual reality headset) where control types and their locations either met the de facto industry standard or did not. The tasks were repeated four times for each interior. Results showed that the mean search time and number of errors were higher when the control type or location did not align with the de facto standard. Furthermore, a novel location was found to have a greater impact on search time, whereas the number of errors was greater when searching for a novel control type. The findings suggest that moving away from de facto standards should be carefully considered by automotive manufacturers.
2024 · Duncan A Robertson et al. · Automated Driving Interface & Takeover Design · Head-Up Display (HUD) & Advanced Driver Assistance Systems (ADAS) · AutoUI
To Please in a Pod: Employing an Anthropomorphic Agent-Interlocutor to Enhance Trust and User Experience in an Autonomous, Self-Driving Vehicle
Recognising that one of the aims of conversation is to build, maintain and strengthen positive relationships with others, the study explores whether passengers in an autonomous vehicle display similar behaviour during transactions with an on-board conversational agent-interface, and whether related attributes (e.g. trust) transcend to the vehicle itself. Employing a counterbalanced, within-subjects design, thirty-four participants were transported in a self-driving pod within an expansive testing arena. Participants undertook three journeys with an anthropomorphic agent-interlocutor (via Wizard-of-Oz), a voice-command interface, or a traditional touch-surface; each delivered equivalent task-related information. Results show that the agent-interlocutor was the most preferred interface, attracting the highest ratings of trust and significantly enhancing the pleasure and sense of control over the journey experience, despite the inclusion of 'trust challenges' as part of the design. The findings can help support the design and development of in-vehicle agent-based voice interfaces to enhance trust and user experience in autonomous cars.
2019 · David R. Large et al. · Voice User Interface (VUI) Design · Agent Personality & Anthropomorphism · AutoUI
Fitts Goes Autobahn: Assessing the Visual Demand of Finger-Touch Pointing Tasks in an On-Road Study
The visual demand of finger-touch interactions with touchscreens has been increasingly modelled using Fitts' Law. With respect to driving, these models facilitate the prediction of mean glance duration and total glance time from an index of difficulty based on target size and location. Strong relationships between these measures have been found in the controlled conditions of driving simulators. The present study aimed to validate such models in naturalistic conditions. Nineteen experienced drivers carried out a range of touchscreen button-press tasks in an instrumented car on a UK motorway. In contrast with previous simulator-based work, our on-road data produced much weaker relationships between the index of difficulty and glance times. The model improved when focusing on tasks that required only one glance. Limitations of Fitts' Law in the more complex and dynamic real-world driving environment are discussed, as are the potential drawbacks of driving simulators for conducting visual demand research.
2019 · Sanna M Pampel et al. · Automated Driving Interface & Takeover Design · Head-Up Display (HUD) & Advanced Driver Assistance Systems (ADAS) · AutoUI
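The Fitts' Law modelling referenced in this abstract can be sketched as follows. The index of difficulty below uses the standard Shannon formulation; the linear glance-time coefficients `a` and `b` are illustrative placeholders only, not values reported by the study.

```python
import math

def fitts_index_of_difficulty(distance_mm: float, width_mm: float) -> float:
    """Shannon formulation of Fitts' index of difficulty, in bits:
    ID = log2(D / W + 1), where D is the movement distance to the
    target and W is the target width."""
    return math.log2(distance_mm / width_mm + 1)

def predicted_glance_time(id_bits: float, a: float = 0.3, b: float = 0.2) -> float:
    """Linear model T = a + b * ID, as used to predict mean glance
    duration or total glance time from the index of difficulty.
    The intercept a and slope b are hypothetical placeholders and
    would be fitted per study from observed glance data."""
    return a + b * id_bits
```

A distant, small target (high ID) thus yields a longer predicted glance time than a near, large one, which is the relationship the on-road data only weakly supported.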
Haptic Navigation Cues on the Steering Wheel
Haptic feedback is used in cars to reduce visual inattention. While tactile feedback such as vibration can be influenced by the car's movement, thermal and cutaneous push feedback should be independent of such interference. This paper presents two driving simulator studies investigating novel tactile feedback on the steering wheel for navigation. In the first, devices on one side of the steering wheel were warmed to indicate the turning direction, while those on the other side were cooled. This thermal feedback was compared to audio. The thermal navigation cues led to 94.2% correct recognition of warnings 200 m before the turn and to 91.7% correct turns; speech had perfect recognition for both. In the second experiment, only the destination side was indicated thermally, and this design was compared to cutaneous push feedback. The simplified thermal feedback design did not increase recognition, but cutaneous push feedback had high recognition rates (100% for 200 m warnings, 98% for turns).
2019 · Patrizia Di Campli San Vito et al. · University of Glasgow · In-Vehicle Haptic, Audio & Multimodal Feedback · Vibrotactile Feedback & Skin Stimulation · CHI
P7 - Evaluating How Interfaces Influence the User Interaction with Fully Autonomous Vehicles
With increasing automation, occupants of fully autonomous vehicles are likely to be completely disengaged from the driving task. However, even with no driving involved, there are still activities that will require interfaces between the vehicle and passengers. This study evaluated different configurations of screens providing operation-related information to occupants for tracking the progress of journeys. Surveys and interviews were used to measure trust, usability, workload and experience after users were driven in an autonomous low-speed pod. Results showed that participants want to monitor the state of the vehicle and see details about the ride, including a map of the route and related information. There was a preference for this information to be displayed via an onboard touchscreen combined with an overhead letterbox display rather than a smartphone-based interface. This paper provides recommendations for the design of devices with the potential to improve user interaction with future autonomous vehicles.
2018 · Luis Oliveira et al. · Automated Driving Interface & Takeover Design · Motion Sickness & Passenger Experience · AutoUI
Establishing the Role of a Virtual Lead Vehicle as a Novel Augmented Reality Navigational Aid
This paper reports on two studies investigating how following a lead vehicle could act as a metaphor for an Augmented Reality (AR) system to support navigation tasks. In the first, formative study, 34 participants completed a video-based evaluation of the role of a real lead vehicle when navigating a coherent journey. Verbal protocols indicated that a lead vehicle may be a valuable navigation aid at a range of junction types, but not where drivers desire a preview of upcoming steps or of their overall orientation. A subsequent driving simulator study with 22 participants examined whether an AR lead vehicle may support drivers when navigating complex junctions, specifically large multi-exit roundabouts. The virtual car led to good navigation and driving performance, comparable to a more traditional screen-fixed interface. Overall, this work demonstrates that a virtual lead vehicle may be beneficial within AR navigation devices.
2018 · Bethan Hannah Topliss et al. · Head-Up Display (HUD) & Advanced Driver Assistance Systems (ADAS) · AR Navigation & Context Awareness · AutoUI
Investigation of Thermal Stimuli for Lane Changes
Haptic feedback has been widely studied for in-car interactions, but most of this research has used vibrotactile cues. This paper presents two studies that examine novel thermal feedback for navigation during a simulated lane change task. In the first, we compare the distraction and timing differences between audio and thermal feedback. The results show that the presentation of thermal stimuli does not increase lane deviation, but the time needed to complete a lane change increased by 1.82 seconds. In the second study, the influence of varying the parameters of the thermal stimuli on lane change performance was tested. We found that the same stimulus design for warm and cold temperatures does not always elicit the same results, and that such variations can have different effects on specific tasks. This suggests that the design of thermal stimuli is highly dependent on which task outcome should be maximised.
2018 · Patrizia Di Campli San Vito et al. · In-Vehicle Haptic, Audio & Multimodal Feedback · Vibrotactile Feedback & Skin Stimulation · AutoUI
Selection Facilitation Schemes for Predictive Touch with Mid-air Pointing Gestures in Automotive Displays
Predictive touch is an HMI technology that relies on inferring, early in the pointing gesture, the interface item a driver or passenger intends to select on an in-vehicle display [1, 2]. It simplifies and expedites the selection task, thereby reducing the associated interaction effort. This paper presents two studies on drivers using predictive touch and focuses on evaluating the best means of facilitating selection of the intended on-display item. The candidate schemes were immediate mid-air selection, in which the system autonomously auto-selects the predicted interface component; hover/dwell; and pressing a button on the steering wheel to execute the selection action. These were arrived at in an expert workshop study with twelve participants. The results of the subsequent evaluation study with twenty-four participants demonstrate, using quantitative and qualitative measures, that immediate mid-air selection is a promising assistive scheme: drivers need not touch a physical surface to select interface components, enabling touch-free control.
2018 · Bashar I. Ahmad et al. · Head-Up Display (HUD) & Advanced Driver Assistance Systems (ADAS) · Hand Gesture Recognition · AutoUI
An Evaluation of Inclusive Dialogue-Based Interfaces for the Takeover of Control in Autonomous Cars
This paper presents an evaluation of dialogue-based interfaces that mediate the driver's takeover of vehicle control from the autonomous mode of a car. Four concepts designed to increase driver Situation Awareness were evaluated in a driving simulator. They used dialogue-based interaction, in which driving-related information was either asked of or repeated by the driver, alongside an alternative countdown-timer interface with no additional information. An inclusive set of participants, spanning a wide age range, tested the interfaces, with their performance and views recorded. The shorter and simpler interaction of the countdown timer was most accepted. The interface seeking answers to driving-related questions came next, and the interface requiring repetition of driving-related information, even when augmented by visual and tactile cues, was least accepted. Design guidelines on utilizing dialogue as an effective means of keeping the driver in the loop when taking control from an autonomous vehicle were thus derived.
2018 · Ioannis Politis et al. · Automated Driving Interface & Takeover Design · Voice User Interface (VUI) Design · IUI