Understanding the Influence of Electrical Muscle Stimulation on Motor Learning: Enhancing Motor Learning or Disrupting Natural Progression?
Electrical Muscle Stimulation (EMS) induces muscle movement through external currents, offering a novel approach to motor learning. We investigated EMS as an alternative to conventional non-movement-inducing feedback techniques, such as vibrotactile and electrotactile feedback. While EMS shows promise in areas such as dance, sports, and motor skill acquisition, neurophysiological models of motor learning disagree about the impact of externally induced movements on sensorimotor representations. This study evaluated EMS against electrotactile feedback and a control condition in a two-session experiment assessing fast learning, consolidation, and learning transfer. Our results suggest an overall positive impact of EMS on motor learning. Although traditional electrotactile feedback yielded a higher learning rate, EMS increased the learning plateau, as measured by a three-factor exponential decay model. This study provides empirical evidence supporting EMS as a plausible method for motor augmentation and skill transfer, contributing to the understanding of its role in motor learning.
2025 · Steeven Villa et al. · LMU Munich · Vibrotactile Feedback & Skin Stimulation · Electrical Muscle Stimulation (EMS) · CHI
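The abstract compares conditions via the learning rate and learning plateau of a three-factor exponential decay model. A minimal sketch of one common parameterization of such a model, fitted to hypothetical trial data (the paper's exact model and data may differ):

```python
import numpy as np
from scipy.optimize import curve_fit

def learning_curve(trial, plateau, amplitude, rate):
    # Error decays from (plateau + amplitude) toward the asymptote
    # `plateau`; `rate` is the per-trial learning rate.
    return plateau + amplitude * np.exp(-rate * trial)

# Hypothetical per-trial error scores for one learner (not study data)
rng = np.random.default_rng(42)
trials = np.arange(1, 41)
errors = learning_curve(trials, 5.0, 20.0, 0.15) + rng.normal(0.0, 0.5, trials.size)

# Fit the three factors; the rate and plateau are the quantities the
# abstract compares across feedback conditions.
(plateau, amplitude, rate), _ = curve_fit(
    learning_curve, trials, errors, p0=(1.0, 10.0, 0.1)
)
```

Under this reading, EMS "increasing the learning plateau" corresponds to a better asymptotic `plateau`, while electrotactile feedback's "higher learning rate" corresponds to a larger `rate`.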
An Approach to Elicit Human-Understandable Robot Expressions to Support Human-Robot Interaction
Understanding the intentions of robots is essential for natural and seamless human-robot collaboration. Ensuring that robots have means for non-verbal communication is a basis for intuitive and implicit interaction. For this, we describe an approach to elicit and design human-understandable robot expressions. We outline the approach in the context of non-humanoid robots. We paired human mimicking and enactment with research from gesture elicitation in two phases: first, to elicit expressions, and second, to ensure they are understandable. We present an example application through two studies (N=16 & N=260) of our approach to elicit expressions for a simple 6-DoF robotic arm. We show that the approach enabled us to design robot expressions that signal curiosity and interest in getting attention. Our main contribution is an approach to generate and validate understandable expressions for robots, enabling more natural human-robot interaction.
2025 · Jan Leusmann et al. · LMU Munich · Hand Gesture Recognition · Social Robot Interaction · Human-Robot Collaboration (HRC) · CHI
Developing and Validating the Perceived System Curiosity Scale (PSC): Measuring Users' Perceived Curiosity of Systems
Like humans, today's systems, such as robots and voice assistants, can express curiosity to learn and engage with their surroundings. While curiosity is a well-established human trait that enhances social connections and drives learning, no existing scales assess the perceived curiosity of systems. Thus, we introduce the Perceived System Curiosity (PSC) scale to determine how users perceive curious systems. We followed a standardized process of developing and validating scales, resulting in a validated 12-item scale with 3 individual sub-scales measuring explorative, investigative, and social dimensions of system curiosity. In total, we generated 831 items based on literature and recruited 414 participants for item selection and 320 additional participants for scale validation. Our results show that the PSC scale has inter-item reliability and convergent and construct validity. Thus, this scale provides an instrument to systematically explore how perceived curiosity influences interactions with technical systems.
2025 · Jan Leusmann et al. · LMU Munich · Brain-Computer Interface (BCI) & Neurofeedback · Agent Personality & Anthropomorphism · Generative AI (Text, Image, Music, Video) · CHI
Embracer: A Wearable Encountered-Type Haptic Controller for 3 DoF Input and Feedback
The lack of haptic sensations beyond very simple vibration feedback diminishes the feeling of presence in Virtual Reality. Research has suggested various approaches to deliver haptic sensations to the user's palm. However, these approaches are typically limited in the number of actuation directions and only focus on enhancing the system's output, ignoring haptic input. We present Embracer, a wrist-mounted encountered-type haptic controller that addresses these gaps by rendering forces along three axes through a sphere-shaped end effector within the user's palm. Using modified servo motors, we sense user-performed manipulations of the end effector as an input modality. In this paper, we contribute the design and implementation of Embracer together with a preliminary technical evaluation. By providing a more comprehensive haptic feedback system, Embracer enhances the realism and immersion of haptic feedback and user control.
2024 · Dennis Dietz et al. · Force Feedback & Pseudo-Haptic Weight · Haptic Wearables · UbiComp
Exploring Redirection and Shifting Techniques to Mask Hand Movements from Shoulder-Surfing Attacks during PIN Authentication in Virtual Reality
The proliferation of mobile Virtual Reality (VR) headsets shifts our interaction with virtual worlds beyond our living rooms into shared spaces. Consequently, we are entrusting more and more personal data to these devices, calling for strong security measures and authentication. However, the standard authentication method of such devices, entering PINs via virtual keyboards, is vulnerable to shoulder-surfing, as movements to enter keys can be monitored by an unnoticed observer. To address this, we evaluated masking techniques that obscure VR users' input during PIN authentication by diverting their hand movements. Through two experimental studies, we demonstrate that these methods increase users' security against shoulder-surfing attacks from observers without excessively impacting their experience and performance. With these discoveries, we aim to enhance the security of future VR authentication without disrupting the virtual experience or necessitating additional hardware or user training.
2024 · Yannick Weiss et al. · Passwords & Authentication · MobileHCI
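The core idea of redirecting hand movements can be sketched as an offset between the physical hand (what a shoulder-surfer observes) and the rendered hand (what selects the PIN key). The scheme below is a hypothetical illustration, not the paper's exact technique:

```python
def rendered_hand(real_xyz, virtual_target, physical_end, progress):
    """Offset the rendered hand so that when the physical hand arrives at
    `physical_end` (the position an onlooker sees), the virtual hand
    arrives at `virtual_target` (the PIN key actually pressed).
    `progress` in [0, 1] ramps the offset gradually along the reach so
    the mismatch stays below the user's detection threshold.
    Hypothetical scheme for illustration only."""
    return [r + (t - p) * progress
            for r, t, p in zip(real_xyz, virtual_target, physical_end)]
```

At `progress = 0` the rendered hand matches the physical hand; at `progress = 1` the full offset is applied, decoupling the observed trajectory from the entered key.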
An Examination of Ultrasound Mid-air Haptics for Enhanced Material and Temperature Perception in Virtual Environments
Rendering realistic tactile sensations of virtual objects remains a challenge in VR. While haptic interfaces have advanced, particularly with phased arrays, their ability to create realistic object properties like state and temperature remains unclear. This study investigates the potential of Ultrasound Mid-air Haptics (UMH) for enhancing the perceived congruency of virtual objects. In a user study with 30 participants, we assessed how UMH impacts the perceived material state and temperature of virtual objects. We also analyzed EEG data to understand how participants integrate UMH information physiologically. Our results reveal that UMH significantly enhances the perceived congruency of virtual objects, particularly for solid objects, reducing the feeling of mismatch between visual and tactile feedback. Additionally, UMH consistently increases the perceived temperature of virtual objects. These findings offer valuable insights for haptic designers, demonstrating UMH's potential for creating more immersive tactile experiences in VR by addressing key limitations in current haptic technologies.
2024 · Steeven Villa et al. · Mid-Air Haptics (Ultrasonic) · MobileHCI
"AI enhances our performance, I have no doubt this one will do the same": The Placebo effect is robust to negative descriptions of AIHeightened AI expectations facilitate performance in human-AI interactions through placebo effects. While lowering expectations to control for placebo effects is advisable, overly negative expectations could induce nocebo effects. In a letter discrimination task, we informed participants that an AI would either increase or decrease their performance by adapting the interface, when in reality, no AI was present in any condition. A Bayesian analysis showed that participants had high expectations and performed descriptively better irrespective of the AI description when a sham-AI was present. Using cognitive modeling, we could trace this advantage back to participants gathering more information. A replication study verified that negative AI descriptions do not alter expectations, suggesting that performance expectations with AI are biased and robust to negative verbal descriptions. We discuss the impact of user expectations on AI interactions and evaluation.2024AKAgnes Mercedes Kloft et al.Aalto UniversityExplainable AI (XAI)AI-Assisted Decision-Making & AutomationAI Ethics, Fairness & AccountabilityCHI
Society's Attitudes Towards Human Augmentation and Performance Enhancement Technologies (SHAPE) Scale
Human augmentation technologies (ATs) are a subset of ubiquitous on-body devices designed to improve cognitive, sensory, and motor capacities. Although there is a large corpus of knowledge concerning ATs, less is known about societal attitudes towards them and how they shift over time. To that end, we developed the Society's Attitudes Towards Human Augmentation and Performance Enhancement Technologies (SHAPE) Scale, which measures how users of ATs are perceived. To develop the scale, we first created a list of possible scale items based on past work on how people respond to new technologies. The items were then reviewed by experts. Next, we performed exploratory factor analysis to reduce the scale to its final length of thirteen items. Subsequently, we confirmed the test-retest validity of our instrument, as well as its construct validity. The SHAPE scale enables researchers and practitioners to understand elements contributing to attitudes toward augmentation technology users, and assists designers of ATs in creating artifacts that will be more universally accepted. https://doi.org/10.1145/3610915
2023 · Steeven Villa et al. · Brain-Computer Interface (BCI) & Neurofeedback · Motor Impairment Assistive Input Technologies · UbiComp
Towards a Haptic Taxonomy of Emotions: Exploring Vibrotactile Stimulation in the Dorsal Region
The implicit communication of emotional states between persons is a key use case for novel assistive and augmentation technologies. It can serve to expand individuals' perceptual capabilities and assist neurodivergent individuals. Notably, vibrotactile rendering is a promising method for delivering emotional information with minimal interference with visual or auditory perception. To date, the subjective individual association between vibrotactile properties and emotional states remains unclear. Previous approaches relied on analogies or arbitrary variations, limiting generalization. To address this, we conducted a study with 40 participants, analyzing associations between attributes of self-generated vibrotactile patterns (amplitude, frequency, and spatial location of stimulation) and four emotional states (Anger, Happiness, Neutral, Sadness). We find a preference for symmetrically arranged patterns, as well as distinct amplitude and frequency profiles for different emotions. These insights can aid in creating standardized vibrotactile patterns for universal emotional communication.
2023 · Steeven Villa et al. · Vibrotactile Feedback & Skin Stimulation · UbiComp
SensCon: Embedding Physiological Sensing into Virtual Reality Controllers
Virtual reality experiences increasingly use physiological data for virtual environment adaptations to evaluate user experience and immersion. Previous research required complex medical-grade equipment to collect physiological data, limiting real-world applicability. To overcome this, we present SensCon for skin conductance and heart rate data acquisition. To identify the optimal sensor location in the controller, we conducted a first study investigating users' controller grasp behavior. In a second study, we evaluated the performance of SensCon against medical-grade devices in six scenarios regarding user experience and signal quality. Users subjectively preferred SensCon in terms of usability and user experience. Moreover, the signal quality evaluation showed satisfactory accuracy across static, dynamic, and cognitive scenarios. Therefore, SensCon reduces the complexity of capturing real-time physiological data and adapting the environment accordingly. By open-sourcing SensCon, we enable researchers and practitioners to adapt their virtual reality environments effortlessly. Finally, we discuss possible use cases for virtual reality-embedded physiological sensing.
2023 · Francesco Chiossi et al. · Immersion & Presence Research · Biosensors & Physiological Monitoring · Context-Aware Computing · MobileHCI
Towards an Implicit Metric of Sensory-Motor Accuracy: Brain Responses to Auditory Prediction Errors in Pianists
When listening to music, the brain expects specific acoustic events based on learned musical rules. During music performance, expectancy is additionally created by motor action, which links keypresses to their sounds. We investigated EEG (electroencephalography) responses to auditory expectancy violations in piano performance and perception. In our study, pianists experienced manipulations of different acoustic features, such as pitch and loudness, while playing and listening to piano sequences. We found that manipulations during performance elicited deflections with stronger amplitudes than manipulations during perception, indicating that the action of producing sounds strengthens auditory expectancy. Loudness manipulations, which violate musical regularity, elicited deflections with shorter latencies than pitch manipulations, which violate harmonic expectancy, suggesting that the brain processes expectancy violations of distinct acoustic features differently. These EEG signatures may prove useful for applications in intelligent music interfaces by providing information about sensory-motor accuracy.
2023 · Elisabeth Pangratz et al. · Brain-Computer Interface (BCI) & Neurofeedback · Biosensors & Physiological Monitoring · C&C
Using Pseudo-Stiffness to Enrich the Haptic Experience in Virtual Reality
Providing users with a haptic sensation of the hardness and softness of objects in virtual reality is an open challenge. While physical props and haptic devices help, their haptic properties do not allow for dynamic adjustments. To overcome this limitation, we present a novel technique for changing the perceived stiffness of objects based on a visuo-haptic illusion. We achieved this by manipulating the hands' Control-to-Display (C/D) ratio in virtual reality while pressing down on an object with fixed stiffness. In the first study (N=12), we determine the detection thresholds of the illusion. Our results show that we can exploit a C/D ratio from 0.7 to 3.5 without user detection. In the second study (N=12), we analyze the illusion's impact on the perceived stiffness. Our results show that participants perceive the objects to be up to 28.1% softer and 8.9% stiffer, allowing for various haptic applications in virtual reality.
2023 · Yannick Weiss et al. · LMU Munich · Force Feedback & Pseudo-Haptic Weight · Immersion & Presence Research · CHI
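A toy first-order model of the C/D-ratio illusion, assuming the ratio is defined as control (physical) movement over display movement and that force still comes from the fixed physical spring; the paper's exact mapping and the perceptual (rather than geometric) effect sizes differ:

```python
def virtual_depth(physical_depth_mm: float, cd_ratio: float) -> float:
    """Map physical press depth to displayed press depth.
    Assumes C/D ratio = physical movement / displayed movement, so
    ratios > 1 shrink the displayed compression of the object."""
    return physical_depth_mm / cd_ratio

def implied_stiffness(real_stiffness_n_per_mm: float, cd_ratio: float) -> float:
    """With force tied to the fixed physical spring (F = k * d_physical),
    the visually implied stiffness is F / d_virtual = k * cd_ratio.
    Perceived stiffness changes far less than this geometric value."""
    return real_stiffness_n_per_mm * cd_ratio
```

For example, at the reported upper threshold of 3.5, a 10 mm physical press is displayed as roughly a 2.9 mm compression, which the visual system reads as a stiffer object.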
Understanding the Perception of Human Augmentation: A Mixed-Method Study
Technologies that help users overcome their limitations and integrate with the human body are often termed "human augmentations". Such technologies are now available on the consumer market, potentially supporting people in their everyday activities. To date, there is no systematic understanding of the perception of human augmentations. To address this gap and build an understanding of how to design positive experiences with human augmentations, we conducted a mixed-method study of the perception of augmented humans (AHs). We conducted two scenario-based studies: interviews (n=16) and an online study (n=506) with participants from four countries. The scenarios include one of three augmentation categories (sensory, motor, and cognitive) and specify whether the augmented person has a disability. Overall, results show that the type of augmentation and disability impacted user attitudes towards AHs. We derive design dimensions for creating technological augmentations for a diverse and global audience.
2023 · Steeven Villa et al. · LMU Munich · Motor Impairment Assistive Input Technologies · Cognitive Impairment & Neurodiversity (Autism, ADHD, Dyslexia) · Inclusive Design · CHI