Exploring the Effect of Music on User Typing and Identification through Keystroke Dynamics

This paper explores the relationship between music and keyboard typing behavior. In particular, we focus on how it affects keystroke-based authentication systems. To this end, we conducted an online experiment (N=43), where participants were asked to replicate paragraphs of text while listening to music at varying tempos and loudness levels across two sessions. Our findings reveal that listening to music leads to more typing errors, and that fast music leads to faster typing. Identification through a biometric model improved when music was played during either its training or its testing. This hints at the potential of music for increasing identification performance, and at a tradeoff between this benefit and user distraction. Overall, our research sheds light on typing behavior and introduces music as a subtle and effective tool to influence user typing behavior in the context of keystroke-based authentication.

2025 · Lukas Mecke et al. · LMU Munich; University of the Bundeswehr Munich · Tags: Vibrotactile Feedback & Skin Stimulation; Explainable AI (XAI); Passwords & Authentication · Venue: CHI
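Keystroke-based authentication, as studied above, typically builds on timing features such as dwell time (how long a key is held) and flight time (the gap between releasing one key and pressing the next). The sketch below illustrates these two standard features; the event format and field names are illustrative assumptions, not the paper's actual pipeline.

```python
def keystroke_features(events):
    """Compute per-key dwell times and between-key flight times.

    `events` is a list of (key, press_ms, release_ms) tuples in press order.
    This is a minimal illustration of the standard feature definitions, not
    the authors' implementation.
    """
    dwell = [release - press for _, press, release in events]
    flight = [events[i + 1][1] - events[i][2] for i in range(len(events) - 1)]
    return dwell, flight

# Toy example: typing "ab" with hypothetical millisecond timestamps
events = [("a", 0, 80), ("b", 150, 240)]
dwell, flight = keystroke_features(events)
# dwell  = [80, 90]  (how long each key was held)
# flight = [70]      (release of 'a' until press of 'b')
```

A biometric model would then compare statistics of such features (means, variances per key pair) against a user's enrolled profile.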
It’s Not Always the Same Eye That Dominates: Effects of Viewing Angle, Handedness and Eye Movement in 3D

Understanding eye dominance, the subconscious preference for one eye, has significant implications for 3D user interfaces in VR and AR, particularly in interface design and rendering. Although HCI recognizes eye dominance, little is known about what causes it to switch from one eye to the other. To explore this, we studied eye dominance in VR, where 28 participants manually aligned a cursor with a distant target across three tasks. We manipulated the horizontal viewing angle, the hand used for alignment, and eye movement induced by target behaviour. Our results confirm the dynamic nature of eye dominance, though with fewer switches than expected and varying influences across tasks. This highlights the need for adaptive HCI techniques that account for shifts in eye dominance in system design (e.g., gaze-based interaction, visual design, or rendering) and can improve accuracy, usability, and experience.

2025 · Franziska Prummer et al. · Lancaster University, School of Computing and Communications · Tags: Eye Tracking & Gaze Interaction; Immersion & Presence Research · Venue: CHI
SalChartQA: Question-driven Saliency on Information Visualisations

Understanding the link between visual attention and users' information needs when visually exploring information visualisations is under-explored due to a lack of large and diverse datasets to facilitate these analyses. To fill this gap, we introduce SalChartQA, a novel crowd-sourced dataset that uses the BubbleView interface to track user attention and a question-answering (QA) paradigm to induce different information needs in users. SalChartQA contains 74,340 answers to 6,000 questions on 3,000 visualisations. Informed by our analyses demonstrating the close correlation between information needs and visual saliency, we propose the first computational method to predict question-driven saliency on visualisations. Our method outperforms state-of-the-art saliency models on several metrics, such as the correlation coefficient and the Kullback-Leibler divergence. These results show the importance of information needs for shaping attentive behaviour and pave the way for new applications, such as task-driven optimisation of visualisations or explainable AI in chart question-answering.

2024 · Yao Wang et al. · University of Stuttgart · Tags: Explainable AI (XAI); Interactive Data Visualization; Visualization Perception & Cognition · Venue: CHI
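The two evaluation metrics named in the abstract, the correlation coefficient (CC) and the Kullback-Leibler divergence (KLD), are standard in saliency research and compare a predicted saliency map against a ground-truth map. The sketch below shows their common definitions; the exact preprocessing (blurring, normalization constants) used by the paper is not specified here and the epsilon handling is an assumption.

```python
import numpy as np

def cc(pred, gt):
    """Pearson correlation coefficient between two saliency maps
    (higher is better, 1.0 = perfect linear agreement)."""
    p = (pred - pred.mean()) / pred.std()
    g = (gt - gt.mean()) / gt.std()
    return float((p * g).mean())

def kld(pred, gt, eps=1e-8):
    """KL divergence KL(gt || pred) after normalizing both maps to
    probability distributions (lower is better, 0 = identical)."""
    p = pred / (pred.sum() + eps)
    g = gt / (gt.sum() + eps)
    return float((g * np.log(eps + g / (p + eps))).sum())
```

For identical maps, CC is 1.0 and KLD is near zero; a model "outperforming on KLD" means its predicted distribution is closer to the human attention distribution.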
"Hello I am here": Proximal Nonverbal Cues' Role in Initiating Social Interactions in VR

Virtual Reality (VR) has revolutionized social interactions, but limited field of view (FoV) remains a significant obstacle. Users often fail to notice others within the virtual environment, hindering social engagement. To facilitate initiating social interactions, we developed a novel social signaling technique that utilizes proximal nonverbal cues to indicate users' location, name, and interests within a social distance. In a 2 x 2 mixed user study, we found that this technique greatly enhanced social presence and interaction quality among users with prior social ties. Our signaling technique has tremendous potential to facilitate social interactions across various social virtual events, such as staff meetings and reunions.

2023 · Amal Yassien et al. · Tags: Social & Collaborative VR; Immersion & Presence Research · Venue: UbiComp
"Your Eyes Say You Have Used This Password Before": Identifying Password Reuse from Gaze Behavior and Keystroke Dynamics

A significant drawback of text passwords for end-user authentication is password reuse. We propose a novel approach to detect password reuse by leveraging gaze as well as typing behavior, and study its accuracy. We collected gaze and typing behavior from 49 users while they created accounts for 1) a webmail client and 2) a news website. While most participants came up with a new password, 32% reported having reused an old password when setting up their accounts. We then compared different ML models to detect password reuse from the collected data. Our models achieve an accuracy of up to 87.7% in detecting password reuse from gaze, 75.8% from typing, and 88.75% when considering both types of behavior. We demonstrate that, using gaze, password reuse can already be detected during the registration process, before users enter their password. Our work paves the way for developing novel interventions to prevent password reuse.

2022 · Yasmeen Abdrabou et al. · Bundeswehr University Munich, University of Glasgow · Tags: Eye Tracking & Gaze Interaction; Passwords & Authentication · Venue: CHI
PriView – Exploring Visualisations Supporting Users' Privacy Awareness

We present PriView, a concept that allows privacy-invasive devices in the users' vicinity to be visualised. PriView is motivated by the ever-increasing number of sensors in our environments tracking potentially sensitive data (e.g., audio and video). At the same time, users are often unaware of this, which violates their privacy. Knowledge about potential recording would enable users to avoid such areas or not to disclose certain information. We built two prototypes: a) a mobile application capable of detecting smart devices in the environment using a thermal camera, and b) VR mockups of six scenarios where PriView might be useful (e.g., a rental apartment). In both, we included several types of visualisation. Results of our lab study (N=24) indicate that users prefer simple, permanent indicators while wishing for detailed visualisations on demand. Our exploration is meant to support future designs of privacy visualisations for varying smart environments.

2021 · Sarah Prange et al. · Bundeswehr University Munich, LMU Munich · Tags: Privacy by Design & User Control; Privacy Perception & Decision-Making; Context-Aware Computing · Venue: CHI
Understanding User Identification in Virtual Reality through Behavioral Biometrics and the Effect of Body Normalization

Virtual Reality (VR) is becoming increasingly popular in both the entertainment and professional domains. Behavioral biometrics have recently been investigated as a means to continuously and implicitly identify users in VR. Applications in VR can specifically benefit from this, for example, to adapt virtual environments and user interfaces as well as to authenticate users. In this work, we conduct a lab study (N=16) to explore how accurately users can be identified during two task-driven scenarios based on their spatial movement. We show that an identification accuracy of up to 90% is possible across sessions recorded on different days. Moreover, we investigate the role of users' physiology in behavioral biometrics by virtually altering and normalizing their body proportions. We find that body normalization generally increases the identification rate, in some cases by up to 38%; hence, it improves the performance of identification systems.

2021 · Jonathan Liebers et al. · University of Duisburg-Essen · Tags: Human Pose & Activity Recognition; Identity & Avatars in XR · Venue: CHI
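Identification from spatial movement, as described above, amounts to matching a new movement sample against per-user profiles learned from enrollment sessions. A minimal way to sketch this is nearest-centroid matching over movement feature vectors; the features, data, and classifier below are illustrative assumptions, not the paper's actual model.

```python
import numpy as np

def identify(train_feats, probe):
    """Nearest-centroid identification: assign the probe sample to the
    user whose mean enrollment feature vector is closest (Euclidean).

    `train_feats` maps user name -> array of feature vectors, one row
    per enrollment sample; `probe` is a single feature vector.
    """
    names = list(train_feats)
    centroids = np.stack([train_feats[n].mean(axis=0) for n in names])
    dists = np.linalg.norm(centroids - probe, axis=1)
    return names[int(np.argmin(dists))]

# Hypothetical per-user movement features, e.g. (head height in m, hand speed)
train = {
    "user_a": np.array([[1.70, 0.9], [1.72, 1.0]]),
    "user_b": np.array([[1.55, 1.4], [1.58, 1.5]]),
}
print(identify(train, np.array([1.71, 0.95])))  # → user_a
```

Body normalization, in this framing, would rescale features like head height before matching, so that the model relies on behavior rather than physique; the paper reports this improves identification rather than hurting it.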
Therminator: Understanding the Interdependency of Visual and On-Body Thermal Feedback in Virtual Reality

Recent advances have made Virtual Reality (VR) more realistic than ever before. This improved realism is attributed to today's ability to increasingly appeal to human sensations, such as visual, auditory, or tactile ones. While research also examines temperature sensation as an important aspect, the interdependency of visual and thermal perception in VR is still underexplored. In this paper, we propose Therminator, a thermal display concept that provides warm and cold on-body feedback in VR through heat conduction of flowing liquids at different temperatures. Further, we systematically evaluate the interdependency of different visual and thermal stimuli on the temperature perception of the arm and abdomen with 25 participants. Among the results, we found varying temperature perception depending on the stimuli, as well as increased involvement of users during conditions with matching stimuli.

2020 · Sebastian Günther et al. · Technische Universität Darmstadt · Tags: Mid-Air Haptics (Ultrasonic); Immersion & Presence Research · Venue: CHI
Virtual Field Studies: Conducting Studies on Public Displays in Virtual Reality

Field studies on public displays can be difficult, expensive, and time-consuming. We investigate the feasibility of using virtual reality (VR) as a test bed to evaluate deployments of public displays. Specifically, we investigate whether results from virtual field studies, conducted in a virtual public space, match the results from a corresponding real-world setting. We report on two empirical user studies in which we compared audience behavior around a virtual public display in the virtual world to audience behavior around a real public display. We found that virtual field studies can be a powerful research tool, as in both studies we observed largely similar behavior between the settings. We discuss the opportunities, challenges, and limitations of using virtual reality to conduct field studies, and provide lessons learned from our work that can help researchers decide whether to employ VR in their research and what factors to account for if doing so.

2020 · Ville Mäkelä et al. · Ludwig Maximilian University of Munich & Tampere University · Tags: Social & Collaborative VR; Field Studies · Venue: CHI