Situated Artifacts Amplify Engagement in Physical Activity

In the context of rising sedentary lifestyles, this paper investigates the efficacy of "Situated Artifacts" in promoting physical activity. We designed two artifacts that display users' physical activity data within their homes: one physical and one digital. We conducted a 9-week, counterbalanced, within-subject field study (N=24) to assess the impact of these artifacts on physical activity, reflection, and motivation. We collected quantitative physical activity data and administered daily and weekly questionnaires, employing individual Likert items and standardized instruments, as well as conducting post-usage interviews. Our findings indicate that while both artifacts act as reminders for physical activity, the physical artifact was superior in terms of user engagement. The study revealed that this can be attributed to its higher perceived presence and, thereby, enhanced social interaction, which acts as a motivational source for activity. In this sense, situated artifacts gently nudge users toward sustainable health behavior change.

2025 · Jonas Keppel et al. · Fitness Tracking & Physical Activity Monitoring · Sleep & Stress Monitoring · DIS
Spatial Haptics: A Sensory Substitution Method for Distal Object Detection Using Tactile Cues

We present a sensory substitution-based method for representing locations of remote objects in 3D space via haptics. By imitating auditory localization processes, we enable vibrotactile localization abilities similar to those of some spiders, elephants, and other species. We evaluated this concept in virtual reality by modulating the vibration amplitude of two controllers depending on their locations relative to a target. We developed two implementations applying this method, using either ear or hand locations. A proof-of-concept study assessed localization performance and user experience, achieving under 30° differentiation between horizontal targets with no prior training. This unique approach enables localization using only two actuators, requires low computational power, and could potentially assist users in gaining spatial awareness in challenging environments. We compare the implementations and discuss the use of hands as ears in motion, a novel technique not previously explored in the sensory substitution literature.

2025 · Iddo Yehoshua Wald et al. · University of Bremen, Digital Media Lab · Vibrotactile Feedback & Skin Stimulation · Full-Body Interaction & Embodied Input · CHI
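The amplitude-modulation idea can be sketched in a few lines. This is an illustrative distance-based mapping only, not the paper's implementation: the inverse-share weighting and the coordinate frame are assumptions.

```python
import math

def vibration_amplitudes(left_pos, right_pos, target_pos, max_amp=1.0):
    """Map a target's location to two vibration amplitudes, loosely
    imitating interaural level differences: the actuator closer to the
    target vibrates more strongly. Positions are (x, y, z) tuples in a
    shared coordinate frame (assumed layout, not the paper's model).
    """
    def dist(a, b):
        return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

    d_left = dist(left_pos, target_pos)
    d_right = dist(right_pos, target_pos)
    total = d_left + d_right
    if total == 0:
        return max_amp, max_amp
    # The nearer actuator receives the larger share of the amplitude.
    return max_amp * d_right / total, max_amp * d_left / total
```

The same function covers both implementations described in the abstract: feed it ear positions for the head-based variant, or controller positions for the hands-as-ears variant.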
Understanding User Acceptance of Electrical Muscle Stimulation in Human-Computer Interaction

Electrical Muscle Stimulation (EMS) has unique capabilities that can manipulate users' actions or perceptions, such as actuating user movement while walking, changing the perceived texture of food, and guiding movements for a user learning an instrument. These applications highlight the potential utility of EMS, but such benefits may be lost if users reject EMS. To investigate user acceptance of EMS, we conducted an online survey (N=101). We compared eight scenarios: six from HCI research applications and two from the sports and health domain. To gain further insights, we conducted in-depth interviews with a subset of the survey respondents (N=10). The results point to the challenges and potential of EMS regarding social and technological acceptance, showing that there is greater acceptance of applications that manipulate action than of those that manipulate perception. The interviews revealed safety concerns and user expectations for the design and functionality of future EMS applications.

2024 · Sarah Faltaous et al. · University of Duisburg-Essen · Electrical Muscle Stimulation (EMS) · CHI
Kinetic Signatures: A Systematic Investigation of Movement-Based User Identification in Virtual Reality

Behavioral Biometrics in Virtual Reality (VR) enable implicit user identification by leveraging the motion data of users' heads and hands from their interactions in VR. This spatiotemporal data forms a Kinetic Signature, which is a user-dependent behavioral biometric trait. Although kinetic signatures have been widely used in recent research, the factors contributing to their degree of identifiability remain mostly unexplored. Drawing from existing literature, this work systematically examines the influence of static and dynamic components in human motion. We conducted a user study (N=24) with two sessions to reidentify users across different VR sports and exercises after one week. We found that the identifiability of a kinetic signature depends on its inherent static and dynamic factors, with the best combination allowing for 90.91% identification accuracy after one week had passed. Therefore, this work lays a foundation for designing and refining movement-based identification protocols in immersive environments.

2024 · Jonathan Liebers et al. · University of Duisburg-Essen · Eye Tracking & Gaze Interaction · Human Pose & Activity Recognition · Social & Collaborative VR · CHI
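The kinetic-signature idea can be illustrated with a toy pipeline: summarize a motion trace as a feature vector, then match it against enrolled users. The speed statistics and nearest-centroid matching below are placeholder assumptions, far simpler than the features and classifiers used in this line of research.

```python
import math

def motion_features(samples):
    """Summarize a motion trace as simple statistics (mean and standard
    deviation of per-frame speed). `samples` is a list of (x, y, z)
    positions captured at a fixed rate; real pipelines draw on much
    richer spatiotemporal features from head and hand data.
    """
    speeds = [math.dist(a, b) for a, b in zip(samples, samples[1:])]
    mean = sum(speeds) / len(speeds)
    var = sum((s - mean) ** 2 for s in speeds) / len(speeds)
    return (mean, math.sqrt(var))

def identify(trace, enrolled):
    """Return the enrolled user whose feature centroid is closest
    (nearest-centroid matching, a stand-in for a trained classifier).
    `enrolled` maps user IDs to previously computed feature vectors.
    """
    f = motion_features(trace)
    return min(enrolled, key=lambda user: math.dist(f, enrolled[user]))
```

Reidentification across sessions, as studied in the paper, then amounts to enrolling features from session one and calling `identify` on traces from session two.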
Don’t Forget to Disinfect: Understanding Technology-Supported Hand Disinfection Stations

The global COVID-19 pandemic created a constant need for hand disinfection. While it is still essential, disinfection use is declining with the decrease in perceived personal risk (e.g., as a result of vaccination). Thus, this work explores using different visual cues as reminders for hand disinfection. We investigated public display designs using (1) paper-based only, (2) additional screen-based, or (3) additional projection-based visual cues. To gain insights into these designs, we conducted semi-structured interviews with passersby (N=30). Our results show that the screen- and projection-based conditions were perceived as more engaging. Furthermore, we conclude that the disinfection process consists of four steps that can be supported: drawing attention to the disinfection station, supporting the (subconscious) understanding of the interaction, motivating hand disinfection, and performing the action itself. We conclude with design implications for technology-supported disinfection.

2023 · Jonas Keppel et al. · Context-Aware Computing · Ubiquitous Computing · MobileHCI
Literature Reviews in HCI: A Review of Reviews

This paper analyses Human-Computer Interaction (HCI) literature reviews to provide a clear conceptual basis for authors, reviewers, and readers. HCI is multidisciplinary, and various types of literature reviews exist, from systematic to critical reviews in the style of essays. Yet, there is insufficient consensus on what to expect of literature reviews in HCI. Thus, a shared understanding of literature reviews and clear terminology are needed to plan, evaluate, and use literature reviews, and to further improve review methodology. We analysed 189 literature reviews published at all SIGCHI conferences and in ACM Transactions on Computer-Human Interaction (TOCHI) up until August 2022. We report on the main dimensions of variation: (i) contribution types and topics; and (ii) structure and methodologies applied. We identify gaps and trends to inform future meta work in HCI and provide a starting point for moving towards a more comprehensive terminology system for literature reviews in HCI.

2023 · Evropi Stefanidi et al. · University of Bremen · User Research Methods (Interviews, Surveys, Observation) · Research Ethics & Open Science · CHI
HotFoot: Foot-Based User Identification using Thermal Imaging

We propose a novel method for seamlessly identifying users by combining thermal and visible feet features. While it is known that users' feet have unique characteristics, these have so far been underutilized for biometric identification, as observing those features often requires the removal of shoes and socks. As thermal cameras are becoming ubiquitous, we foresee a new form of identification that uses feet features and heat traces to reconstruct the footprint even while shoes or socks are worn. We collected a dataset of users' feet (N=21), wearing three types of footwear (personal shoes, standard shoes, and socks) on three floor types (carpet, laminate, and linoleum). By combining visual and thermal features, an AUC between 91.1% and 98.9% can be achieved, depending on floor and footwear type, with personal shoes on linoleum floor performing best. Our findings demonstrate the potential of thermal imaging for continuous and unobtrusive user identification.

2023 · Alia Saad et al. · University of Duisburg-Essen · Human Pose & Activity Recognition · Biosensors & Physiological Monitoring · CHI
How to Communicate Robot Motion Intent: A Scoping Review

Robots are becoming increasingly omnipresent in our daily lives, supporting us and carrying out autonomous tasks. In Human-Robot Interaction, human actors benefit from understanding the robot's motion intent to avoid task failures and foster collaboration. Finding effective ways to communicate this intent to users has recently received increased research interest. However, no common language has been established to systematize robot motion intent. This work presents a scoping review aimed at unifying existing knowledge. Based on our analysis, we present an intent communication model that depicts the relationship between robot and human through different intent dimensions (intent type, intent information, intent location). We discuss these different intent dimensions and their interrelationships with different kinds of robots and human roles. Throughout our analysis, we classify the existing research literature along our intent communication model, allowing us to identify key patterns and possible directions for future research.

2023 · Max Pascher et al. · Westphalian University of Applied Sciences, University of Duisburg-Essen · Social Robot Interaction · Human-Robot Collaboration (HRC) · CHI
VRception: Rapid Prototyping of Cross-Reality Systems in Virtual Reality

Cross-reality systems empower users to transition along the reality-virtuality continuum or collaborate with others experiencing different manifestations of it. However, prototyping these systems is challenging, as it requires sophisticated technical skills, time, and often expensive hardware. We present VRception, a concept and toolkit for quick and easy prototyping of cross-reality systems. By simulating all levels of the reality-virtuality continuum entirely in Virtual Reality, our concept overcomes the asynchronicity of realities, eliminating technical obstacles. Our VRception Toolkit leverages this concept to allow rapid prototyping of cross-reality systems and easy remixing of elements from all continuum levels. We replicated six cross-reality papers using our toolkit and presented them to their authors. Interviews with them revealed that our toolkit sufficiently replicates their core functionalities and allows quick iterations. Additionally, remote participants used our toolkit in pairs to collaboratively implement prototypes in about eight minutes that they would have otherwise expected to take days.

2022 · Uwe Gruenefeld et al. · University of Duisburg-Essen · Mixed Reality Workspaces · Immersion & Presence Research · CHI
Understanding User Identification in Virtual Reality through Behavioral Biometrics and the Effect of Body Normalization

Virtual Reality (VR) is becoming increasingly popular both in the entertainment and professional domains. Behavioral biometrics have recently been investigated as a means to continuously and implicitly identify users in VR. Applications in VR can specifically benefit from this, for example, to adapt virtual environments and user interfaces as well as to authenticate users. In this work, we conduct a lab study (N=16) to explore how accurately users can be identified during two task-driven scenarios based on their spatial movement. We show that an identification accuracy of up to 90% is possible across sessions recorded on different days. Moreover, we investigate the role of users' physiology in behavioral biometrics by virtually altering and normalizing their body proportions. We find that body normalization in general increases the identification rate, in some cases by up to 38%; hence, it improves the performance of identification systems.

2021 · Jonathan Liebers et al. · University of Duisburg-Essen · Human Pose & Activity Recognition · Identity & Avatars in XR · CHI
Around the (Virtual) World: Infinite Walking in Virtual Reality Using Electrical Muscle Stimulation

Virtual worlds are infinite environments in which the user can move around freely. When shifting from controller-based movement to regular walking as an input, the limits of the real world also limit the virtual world. Tackling this challenge, we propose the use of electrical muscle stimulation to reduce the necessary real-world space and create an unlimited walking experience. We actuate the users' legs so that they deviate from their straight route and thus walk in circles in the real world while still walking straight in the virtual world. We report on a study comparing this approach to vision shift, the state-of-the-art approach, as well as to a combination of both. The results show that particularly the combination of both approaches yields high potential for creating an infinite walking experience.

2019 · Jonas Auda et al. · University of Duisburg-Essen · Electrical Muscle Stimulation (EMS) · Full-Body Interaction & Embodied Input · CHI
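Why walking in real-world circles yields unlimited virtual space can be shown with a few lines of geometry. This is an illustrative sketch of the circle-to-line mapping only; the radius and coordinate conventions are assumptions, not the paper's EMS actuation model.

```python
import math

def real_to_virtual(arc_length, radius):
    """Map a distance walked along a real-world circle of the given
    radius to positions in both worlds. In the virtual world the user
    walks a straight line of that length; in the real world the same
    distance is an arc, so the real position stays within a bounded
    area of diameter 2 * radius, however far the user walks.
    """
    theta = arc_length / radius             # angle swept on the real circle
    real = (radius * math.sin(theta),       # real-world position on the circle
            radius * (1 - math.cos(theta)))
    virtual = (0.0, arc_length)             # straight-ahead virtual position
    return real, virtual
```

After walking half the circle's circumference, for example, the user is only one circle diameter away from the start in the real world, yet has covered the full distance straight ahead in the virtual world.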
Understanding the Impact of Information Representation on Willingness to Share Information

Since the release of the first activity tracker, there has been a steady increase in the number of sensors embedded in wearable devices and, with it, in the amount and diversity of information that can be derived from these sensors. This development leads to novel privacy threats for users. In a web survey with 248 participants, we explored whether users' willingness to share private data depends on how the data is requested by an application. Specifically, requests can be formulated as access to sensor data or as access to information derived from the sensor data (e.g., accelerometer vs. sleep quality). We show that non-expert users lack an understanding of how the two representation levels relate to each other. The results suggest that the willingness to share sensor data over derived information is governed by whether the derived information has positive or negative connotations (e.g., training intensity vs. life expectancy). Using the results of the survey, we derive implications for supporting users in protecting their private data collected via wearable sensors.

2019 · Stefan Schneegass et al. · University of Duisburg-Essen · Privacy by Design & User Control · Privacy Perception & Decision-Making · CHI
Design Guidelines for Reliability Communication in Autonomous Vehicles

Currently offered autonomous vehicles still require human intervention, for instance when the system fails to perform as expected or faces unanticipated situations. Given that the reliability of autonomous systems can fluctuate across conditions, this work is a first step towards understanding how this information ought to be communicated to users. We conducted a user study to investigate the effect of communicating the system's reliability through a feedback bar. Subjective feedback was solicited from participants through questionnaires and semi-structured interviews. Based on the qualitative results, we derived guidelines that serve as a foundation for designing how autonomous systems could provide continuous feedback on their reliability.

2018 · Sarah Faltaous et al. · Automated Driving Interface & Takeover Design · AutoUI
Navigation Systems for Motorcyclists: Exploring Wearable Tactile Feedback for Route Guidance in the Real World

Current navigation systems for motorcyclists use visual or auditory cues for guidance. However, this poses a challenge, since the riders' visual and auditory channels are already occupied with controlling the motorbike, paying attention to other road users, and planning the next turn. In this work, we explore how tactile feedback can be used to guide motorcyclists. We present MOVING (MOtorbike VIbrational Navigation Guidance), a smart kidney belt that presents navigation cues through 12 vibration motors. In addition, we report on the design process of this wearable and on an evaluation with 16 participants in a real-world riding setting. We show that MOVING outperforms off-the-shelf navigation systems in terms of turn errors and distraction.

2018 · Francisco Esteban Kiss et al. · University of Stuttgart · In-Vehicle Haptic, Audio & Multimodal Feedback · Micromobility (E-bike, E-scooter) Interaction · Vibrotactile Feedback & Skin Stimulation · CHI
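How a belt of 12 motors encodes a direction can be sketched as a simple bearing-to-motor lookup. The even spacing, clockwise numbering, and motor 0 at the front are assumptions for illustration; the actual MOVING layout and cue patterns may differ.

```python
def motor_for_bearing(bearing_deg, num_motors=12):
    """Pick which vibration motor around the waist to fire for a
    navigation cue. `bearing_deg` is the direction of the next turn
    relative to the rider's heading (0 = straight ahead). Motors are
    assumed evenly spaced at 360 / num_motors = 30 degree intervals,
    motor 0 at the front, numbered clockwise.
    """
    sector = 360 / num_motors
    return round((bearing_deg % 360) / sector) % num_motors
```

With 12 motors, each covers a 30° sector, so a cue like "turn right" (90°) fires the motor on the rider's right hip, while bearings near 0° wrap back to the front motor.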