Move with Style! Enhancing Avatar Embodiment in Virtual Reality through Proprioceptive Motion Feedback. In virtual reality (VR), users slip into a variety of roles, represented by a rich diversity of avatars that each exhibit specific visual attributes and motion styles. While users can see their avatar's motion in VR, they usually cannot feel it. To enhance avatar embodiment, we propose active proprioceptive feedback that aligns users' physical movements with the expected motion style of their avatar, for instance, by mimicking the avatar's weight, typical motion speed or motion range. We introduce a conceptual space of relevant motion properties that enables designers to create expressive proprioceptive motion styles for avatars. We instantiate this concept with MotionStyler: a system for designing customized motion styles and rendering them in real-time with an arm-based exoskeleton that is synchronized with the VR avatar. Results from a survey confirmed the expressiveness of the proposed conceptual space. A user study demonstrated the system's capability to create diverse proprioceptive motion styles which enhance users' self-identification with their avatar and thereby positively contribute to avatar embodiment in VR. David Wagmann et al., UIST 2025. Topics: Force Feedback & Pseudo-Haptic Weight; Identity & Avatars in XR.
eTactileKit: A Toolkit for Design Exploration and Rapid Prototyping of Electro-Tactile Interfaces. Electro-tactile interfaces are becoming increasingly popular due to their unique advantages, such as delivering fast and localised tactile responses, thin and flexible form factors, and the potential to create novel tactile experiences. However, insights from a formative study with designers highlighted the lack of resources, limited access to information, and the complexity of software and hardware tools. This establishes a high barrier to entry and limits the ability to rapidly prototype and experiment with electro-tactile interfaces. To address these challenges, we propose eTactileKit, a scalable and accessible toolkit providing end-to-end support for designing and prototyping electro-tactile interfaces. eTactileKit comprises a hardware platform and a software framework for designing, simulating and exploring electro-tactile stimuli. We evaluated the impact and usability of eTactileKit through a three-week take-home study, which demonstrated increased accessibility, ease of use, and the toolkit's positive impact on design workflows. Additionally, we implemented a set of use cases to demonstrate the toolkit's practicality and effectiveness across various applications. Praneeth Bimsara Perera et al., UIST 2025. Topics: Electrical Muscle Stimulation (EMS); Prototyping & User Testing.
GestureCoach: Rehearsing for Engaging Talks with LLM-Driven Gesture Recommendations. This paper introduces GestureCoach, a system designed to help speakers deliver more engaging talks by guiding them to gesture effectively during rehearsal. GestureCoach combines an LLM-driven gesture recommendation model with a rehearsal interface that proactively cues speakers to gesture appropriately. Trained on experts' gesturing patterns from TED talks, the model consists of two modules: an emphasis proposal module, which predicts when to gesture by identifying gesture-worthy text segments in the presenter notes, and a gesture identification module, which determines what gesture to use by retrieving semantically appropriate gestures from a curated gesture database. Results of a model performance evaluation and user study (N=30) show that the emphasis proposal module outperforms off-the-shelf LLMs in identifying suitable gesture regions, and that participants rated the majority of these predicted regions and their corresponding gestures as highly appropriate. A subsequent user study (N=10) showed that rehearsing with GestureCoach encouraged speakers to gesture and significantly increased gesture diversity, resulting in more engaging talks. We conclude with design implications for future AI-driven rehearsal systems. Ashwin Ram et al., UIST 2025. Topics: Hand Gesture Recognition; Human-LLM Collaboration; Creative Collaboration & Feedback Systems.
Delusionized? Potential Harms of Proprioceptive Manipulations through Hand Redirection in Virtual Reality. To enhance interactions in VR, hand redirection (HR)-based illusion techniques apply offsets between the virtual and real-world position of users' hands. While adaptation to such HR offsets is recognized, their impact on proprioception accuracy remains unexplored. However, deploying HR without understanding its potential effects on proprioception accuracy may pose risks to users in real-life situations. To investigate this, we conducted an experiment with 22 participants, studying the influence of prolonged exposure to unnoticeable HR offsets on proprioceptive accuracy during hand-reaching in VR. Our results show that proprioceptive accuracy declines significantly after prolonged exposure to redirected hand interactions. However, short-term exposure to unaltered hand interactions can – yet only partially – restore normal levels. Thus, we advocate being aware of potential risks arising from prolonged exposure to visual-proprioceptive offsets to ensure users' safety. Martin Feick et al., UIST 2025. Topics: Haptic Wearables; Hand Gesture Recognition.
From Pegs to Pixels: A Comparative Analysis of the Nine Hole Peg Test and a Digital Copy Drawing Test for Fine Motor Control Assessment. User interaction with digital systems requires Fine Motor Control (FMC), especially if the interfaces are complex or require high fidelity and fine-grained interactions. Despite its importance, Fine Motor Control is often overlooked in interactive system design, partly because of its complex assessment. Measuring changes in fine motor abilities due to prolonged use or fatigue currently requires repeated manual testing. This paper analyzes the concept of using a digital mobile device's input behavior to assess the user's Fine Motor Control. For this, we show that Fine Motor Control can be assessed for touch- and stylus-based interaction with a digital mobile system. We conducted a user study, where participants performed a Nine Hole Peg Test and a predefined Copy Drawing Test before and after exercises that affect fine motor skills. Based on this data, we investigated how metrics such as pressure, velocity, and entropy for touch and stylus input can be used to predict Fine Motor Control. Dominik Schön et al., MobileHCI 2025. Topics: Motor Impairment Assistive Input Technologies; Prototyping & User Testing.
CreepyCoCreator? Investigating AI Representation Modes for 3D Object Co-Creation in Virtual Reality. Generative AI in Virtual Reality offers the potential for collaborative object-building, yet challenges remain in aligning AI contributions with user expectations. In particular, users often struggle to understand and collaborate with AI when its actions are not transparently represented. This paper thus explores the co-creative object-building process through a Wizard-of-Oz study, focusing on how AI can effectively convey its intent to users during object customization in Virtual Reality. Inspired by human-to-human collaboration, we focus on three representation modes: the presence of an embodied avatar, whether the AI's contributions are visualized immediately or incrementally, and whether the areas modified are highlighted in advance. The findings provide insights into how these factors affect user perception and interaction with object-generating AI tools in Virtual Reality as well as satisfaction and ownership of the created objects. The results offer design implications for co-creative world-building systems, aiming to foster more effective and satisfying collaborations between humans and AI in Virtual Reality. Julian Rasch et al. (LMU Munich), CHI 2025. Topics: Mixed Reality Workspaces; Creative Collaboration & Feedback Systems.
IntelliLining: Activity Sensing through Textile Interlining Sensors Using TENGs. We introduce a novel component for smart garments: smart interlining, and validate its technical feasibility through a series of experiments. Our work involved the implementation of a prototype that employs a textile vibration sensor based on Triboelectric Nanogenerators (TENGs), commonly used for activity detection. We explore several unique features of smart interlining, including how sensor signals and patterns are influenced by factors such as the size and shape of the interlining sensor, the location of the vibration source within the sensor area, and various propagation media, such as airborne and surface vibrations. We present our study results and discuss how these findings support the feasibility of smart interlining. Additionally, we demonstrate that smart interlinings on a shirt can detect a variety of user activities involving the hand, mouth, and upper body, achieving an accuracy rate of 93.9% in the tested activities. Mahdie Ghane Ezabadi et al. (Simon Fraser University, Computing Science), CHI 2025. Topics: Haptic Wearables; Electronic Textiles (E-textiles).
ExoKit: A Toolkit for Rapid Prototyping of Interactions for Arm-based Exoskeletons. Exoskeletons open up a unique interaction space that seamlessly integrates users' body movements with robotic actuation. Despite its potential, human-exoskeleton interaction remains an underexplored area in HCI, largely due to the lack of accessible prototyping tools that enable designers to easily develop exoskeleton designs and customized interactive behaviors. We present ExoKit, a do-it-yourself toolkit for rapid prototyping of low-fidelity, functional exoskeletons targeted at novice roboticists. ExoKit includes modular hardware components for sensing and actuating shoulder and elbow joints, which are easy to fabricate and (re)configure for customized functionality and wearability. To simplify the programming of interactive behaviors, we propose functional abstractions that encapsulate high-level human-exoskeleton interactions. These can be readily accessed either through ExoKit's command-line or graphical user interface, a Processing library, or microcontroller firmware, each targeted at different experience levels. Findings from implemented application cases and two usage studies demonstrate the versatility and accessibility of ExoKit for early-stage interaction design. Marie Muehlhaus et al. (Saarland Informatics Campus, Saarland University), CHI 2025. Topics: Force Feedback & Pseudo-Haptic Weight; Shape-Changing Interfaces & Soft Robotic Materials; Circuit Making & Hardware Prototyping.
3HANDS Dataset: Learning from Humans for Generating Naturalistic Handovers with Supernumerary Robotic Limbs. Supernumerary robotic limbs (SRLs) are robotic structures integrated closely with the user's body, which augment human physical capabilities and necessitate seamless, naturalistic human-machine interaction. For effective assistance in physical tasks, enabling SRLs to hand over objects to humans is crucial. Yet, designing heuristic-based policies for robots is time-consuming, difficult to generalize across tasks, and results in less human-like motion. When trained with proper datasets, generative models are powerful alternatives for creating naturalistic handover motions. We introduce 3HANDS, a novel dataset of object handover interactions between a participant performing a daily activity and another participant enacting a hip-mounted SRL in a naturalistic manner. 3HANDS captures the unique characteristics of SRL interactions: operating in intimate personal space with asymmetric object origins, implicit motion synchronization, and the user's engagement in a primary task during the handover. To demonstrate the effectiveness of our dataset, we present three models: one that generates naturalistic handover trajectories, another that determines the appropriate handover endpoints, and a third that predicts the moment to initiate a handover. In a user study (N=10), we compare handover interactions performed with our method to a baseline. The findings show that our method was perceived as significantly more natural, less physically demanding, and more comfortable. Artin Saberpour Abadian et al. (Saarland University, Saarland Informatics Campus), CHI 2025. Topics: Teleoperated Driving; Human-Robot Collaboration (HRC).
PrivateGaze: Preserving User Privacy in Black-box Mobile Gaze Tracking Services. Du et al. develop the PrivateGaze system, which introduces privacy-preserving mechanisms into black-box mobile gaze tracking services to prevent users' gaze data from being leaked. Lingyu Du et al., UbiComp 2024. Topics: Eye Tracking & Gaze Interaction; Privacy by Design & User Control.
Predicting the Limits: Tailoring Unnoticeable Hand Redirection Offsets in Virtual Reality to Individuals' Perceptual Boundaries. Many illusion and interaction techniques in Virtual Reality (VR) rely on Hand Redirection (HR), which has proved to be effective as long as the introduced offsets between the position of the real and virtual hand do not noticeably disturb the user experience. Yet calibrating HR offsets is a tedious and time-consuming process involving psychophysical experimentation, and the resulting thresholds are known to be affected by many variables---limiting HR's practical utility. As a result, there is a clear need for alternative methods that allow tailoring HR to the perceptual boundaries of individual users. We conducted an experiment with 18 participants combining movement, eye gaze and EEG data to detect HR offsets Below, At, and Above individuals' detection thresholds. Our results suggest that we can distinguish HR At and Above from no HR. Our exploration provides a promising new direction with potentially strong implications for the broad field of VR illusions. Martin Feick et al., UIST 2024. Topics: Full-Body Interaction & Embodied Input; Eye Tracking & Gaze Interaction; Brain-Computer Interface (BCI) & Neurofeedback.
Embrogami: Shape-Changing Textiles with Machine Embroidery. Machine embroidery is a versatile technique for creating custom and entirely fabric-based patterns on thin and conformable textile surfaces. However, existing machine-embroidered surfaces remain static, limiting the interactions they can support. We introduce Embrogami, an approach for fabricating textile structures with versatile shape-changing behaviors. Inspired by origami, we leverage machine embroidery to form finger-tip-scale mountain-and-valley structures on textiles with customized shapes, bistable or elastic behaviors, and modular composition. The structures can be actuated by the user or the system to modify the local textile surface topology, creating interactive elements like toggles and sliders or textile shape displays with an ultra-thin, flexible, and integrated form factor. We provide a dedicated software tool and report results of technical experiments to allow users to flexibly design, fabricate, and deploy customized Embrogami structures. With four application cases, we showcase Embrogami's potential to create functional and flexible shape-changing textiles with diverse visuo-tactile feedback. Yu Jiang et al., UIST 2024. Topics: Haptic Wearables; Shape-Changing Interfaces & Soft Robotic Materials.
Mental Models, Expectations and Implications of Client-Side Scanning: An Interview Study with Experts. Client-Side Scanning (CSS) is discussed as a potential solution to contain the dissemination of child sexual abuse material (CSAM). A significant challenge associated with this debate is that stakeholders have different interpretations of the capabilities and frontiers of the concept and its varying implementations. In this paper, we explore stakeholders' understandings of the technology and the expectations and potential implications in the context of CSAM by conducting and analyzing 28 semi-structured interviews with a diverse sample of experts. We identified mental models of CSS and the expected challenges. Our results show that CSS is often a preferred solution in the child sexual abuse debate due to the lack of an alternative. Our findings illustrate the importance of further interdisciplinary discussions to define and comprehend the impact of CSS usage on society, particularly vulnerable groups such as children. Divyanshu Bhardwaj et al. (CISPA Helmholtz Center for Information Security), CHI 2024. Topics: Privacy by Design & User Control; Privacy Perception & Decision-Making; Technology Ethics & Critical HCI.
Beyond the Blink: Investigating Combined Saccadic & Blink-Suppressed Hand Redirection in Virtual Reality. In pursuit of hand redirection techniques that are ever more tailored to human perception, we propose the first algorithm for hand redirection in virtual reality that makes use of saccades, i.e., fast ballistic eye movements that are accompanied by the perceptual phenomenon of change blindness. Our technique combines the previously proposed approaches of gradual hand warping and blink-suppressed hand redirection with the novel approach of saccadic redirection in one unified yet simple algorithm. We compare three variants of the proposed Saccadic & Blink-Suppressed Hand Redirection (SBHR) technique with the conventional approach to redirection in a psychophysical study (N=25). Our results highlight the great potential of our proposed technique for comfortable redirection by showing that SBHR allows for significantly greater magnitudes of unnoticeable redirection while being perceived as significantly less intrusive and less noticeable than commonly employed techniques that only use gradual hand warping. André Zenner et al. (Saarland University & German Research Center for Artificial Intelligence (DFKI), Saarland Informatics Campus), CHI 2024. Topics: Hand Gesture Recognition; Full-Body Interaction & Embodied Input; Eye Tracking & Gaze Interaction.
Flextiles: Designing Customisable Shape-Change in Textiles with SMA-Actuated Smocking Patterns. Shape Memory Alloys (SMAs) afford the seamless integration of shape-changing behaviour into textiles, enabling designers to augment apparel with dynamic shaping and styling. However, existing works fall short of providing versatile methods adaptable to varying scales, materials, and applications, curtailing designers' capacity to prototype customised solutions. To address this, we introduce Flextiles, parameterised SMA design schema that leverage the traditional craft of smocking to integrate planar shape-change seamlessly into diverse textile projects. The conception of Flextiles stems from material experimentation and consultative dialogues with designers, whose insights inspired strategies for customising scale, elasticity, geometry, and actuation of Flextiles. To support the practical implementation of Flextiles, we provide a design tool and experimentally characterise their material properties. Lastly, through a design case study with practitioners, we explore the multifaceted applications and perspectives surrounding Flextiles, and subsequently realise four scenarios that illustrate the creative potential of these modular, customisable patterns. Alice C Haynes et al. (Saarland Informatics Campus), CHI 2024. Topics: Haptic Wearables; Shape-Changing Interfaces & Soft Robotic Materials; Shape-Changing Materials & 4D Printing.
The Impact of Avatar Completeness on Embodiment and the Detectability of Hand Redirection in Virtual Reality. To enhance interactions in VR, many techniques introduce offsets between the virtual and real-world position of users' hands. Nevertheless, such hand redirection (HR) techniques are only effective as long as they go unnoticed by users—not disrupting the VR experience. While several studies consider how much unnoticeable redirection can be applied, these focus on mid-air floating hands that are disconnected from users' bodies. Increasingly, VR avatars are embodied as being directly connected with the user's body, which provide more visual cue anchoring, and may therefore reduce the unnoticeable redirection threshold. In this work, we studied more complete avatars and their effect on the sense of embodiment and the detectability of HR. We found that higher avatar completeness increases embodiment, and we provide evidence for the absence of practically relevant effects on the detectability of HR. Martin Feick et al. (DFKI, Saarland Informatics Campus), CHI 2024. Topics: Force Feedback & Pseudo-Haptic Weight; Full-Body Interaction & Embodied Input; Identity & Avatars in XR.
In Focus, Out of Privacy: The Wearer's Perspective on the Privacy Dilemma of Camera Glasses. The rising popularity of camera glasses challenges societal norms of recording bystanders and thus requires efforts to mediate privacy preferences. We present the first study on the wearers' perspectives and explore privacy challenges associated with wearing camera glasses when bystanders are present. We conducted a micro-longitudinal diary study (N=15) followed by exit interviews with existing users and people without prior experience. Our results show that wearers consider the currently available privacy indicators ineffective. They believe the looks and interaction design of the glasses conceal the technology from unaware people. Due to the lack of effective privacy-mediating measures, wearers feel emotionally burdened with preserving bystanders' privacy. We furthermore elicit how this sentiment impacts their usage of camera glasses and highlight the need for technical and non-technical solutions. Finally, we compare the wearers' and bystanders' perspectives and discuss the design space of a future privacy-preserving ecosystem for wearable cameras. Divyanshu Bhardwaj et al. (CISPA Helmholtz Center for Information Security), CHI 2024. Topics: Privacy by Design & User Control; Privacy Perception & Decision-Making; Participatory Design.
Shaping Compliance: Inducing Haptic Illusion of Compliance in Different Shapes with Electrotactile Grains. Compliance, the degree of displacement under applied force, is pivotal in determining the material perception when touching an object. Vibrotactile actuators can be used for creating grain-based virtual compliance, but they have poor spatial resolution and a limiting rigid form factor. We propose a novel electrotactile compliance illusion that renders grains of electrical pulses on an electrode array in response to finger force changes. We demonstrate its ability to render compliance in distinct shapes through a thin, lightweight, and flexible finger-worn interface. Detailed technical parameters and the implementation of our device are provided. A controlled experiment confirms the technique can (1) create virtual compliance; (2) adjust the compliance magnitude with grain and electrode parameters; and (3) render compliance with specific shapes. In three example applications, we present how this illusion can enhance physical objects, elements in graphical user interfaces, and virtual reality experiences. Arata Jingu et al. (Saarland Informatics Campus), CHI 2024. Topics: Vibrotactile Feedback & Skin Stimulation; Haptic Wearables; Shape-Changing Interfaces & Soft Robotic Materials.
Grand Challenges in SportsHCI. The field of Sports Human-Computer Interaction (SportsHCI) investigates interaction design to support a physically active human being. Despite growing interest and dissemination of SportsHCI literature over the past years, many publications still focus on solving specific problems in a given sport. We believe in the benefit of generating fundamental knowledge for SportsHCI more broadly to advance the field as a whole. To achieve this, we aim to identify the grand challenges in SportsHCI, which can help researchers and practitioners in developing a future research agenda. Hence, this paper presents a set of grand challenges identified in a five-day workshop with 22 experts who have previously researched, designed, and deployed SportsHCI systems. Addressing these challenges will drive transformative advancements in SportsHCI, fostering better athlete performance, athlete-coach relationships, and spectator engagement, as well as immersive experiences for recreational sports and exercise motivation, and ultimately improving human well-being. Don Samitha Elvitigala et al. (Monash University), CHI 2024. Topics: Game UX & Player Behavior; Serious & Functional Games; Mental Health Apps & Online Support Communities.
VoxelHap: A Toolkit for Constructing Proxies Providing Tactile and Kinesthetic Haptic Feedback in Virtual Reality. Experiencing virtual environments is often limited to abstract interactions with objects. Physical proxies allow users to feel virtual objects, but are often inaccessible. We present the VoxelHap toolkit which enables users to construct highly functional proxy objects using Voxels and Plates. Voxels are blocks with special functionalities that form the core of each physical proxy. Plates increase a proxy's haptic resolution, such as its shape, texture or weight. Beyond providing physical capabilities to realize haptic sensations, VoxelHap utilizes VR illusion techniques to expand its haptic resolution. We evaluated the capabilities of the VoxelHap toolkit through the construction of a range of fully functional proxies across a variety of use cases and applications. In two experiments with 24 participants, we investigated a subset of the constructed proxies, studying how they compare to a traditional VR controller. First, we investigated VoxelHap's combined haptic feedback and second, the trade-offs of using ShapePlates. Our findings show that VoxelHap's proxies outperform traditional controllers and were favored by participants. Martin Feick et al., UIST 2023. Topics: In-Vehicle Haptic, Audio & Multimodal Feedback; Shape-Changing Interfaces & Soft Robotic Materials; Immersion & Presence Research.