Move with Style! Enhancing Avatar Embodiment in Virtual Reality through Proprioceptive Motion Feedback
In virtual reality (VR), users slip into a variety of roles, represented by a rich diversity of avatars that each exhibit specific visual attributes and motion styles. While users can see their avatar's motion in VR, they usually cannot feel it. To enhance avatar embodiment, we propose active proprioceptive feedback that aligns users' physical movements with the expected motion style of their avatar, for instance, by mimicking the avatar's weight, typical motion speed, or motion range. We introduce a conceptual space of relevant motion properties that enables designers to create expressive proprioceptive motion styles for avatars. We instantiate this concept with MotionStyler: a system for designing customized motion styles and rendering them in real time with an arm-based exoskeleton that is synchronized with the VR avatar. Results from a survey confirmed the expressiveness of the proposed conceptual space. A user study demonstrated the system's capability to create diverse proprioceptive motion styles that enhance users' self-identification with their avatar and thereby contribute positively to avatar embodiment in VR.
2025 · David Wagmann et al. · Force Feedback & Pseudo-Haptic Weight · Identity & Avatars in XR · UIST
Imaginary Joint: Proprioceptive Feedback for Virtual Body Extensions via Skin Stretch
Virtual body extensions such as a wing or tail have the potential to offer users new bodily experiences and capabilities in virtual and augmented reality. To use these extensions as naturally as one's own body, particularly for body parts that are normally hard to see, such as a tail, it is essential to provide proprioceptive feedback that allows users to perceive the position, orientation, and force exerted by these parts, rather than relying solely on visual cues. In this study, we propose a novel approach: introducing an "Imaginary Joint" at the interface between the user's actual body and the virtual extension, delivering information about joint flexion and force through skin-stretch feedback. We present a wearable device for skin-stretch feedback and explore informative mappings that convey the bending rotation and torque of the Imaginary Joint. The final system presents both types of information simultaneously by superimposing these skin deformations. Results from a controlled experiment demonstrate that users could identify tail position and force without relying on visual cues, and did so more effectively than with vibrotactile feedback. Furthermore, the tail was perceived as more embodied than in the vibrotactile condition, resulting in a more naturalistic and intuitive sensation. Finally, we introduce several application scenarios, including Perception of Extended Bodies, Enhanced Bodily Expression, and Body-Mediated Communication, and discuss the potential for future extensions of this system.
2025 · Shuto Takashita et al. · Haptic Wearables · Shape-Changing Interfaces & Soft Robotic Materials · Dance & Body Movement Computing · UIST
eTactileKit: A Toolkit for Design Exploration and Rapid Prototyping of Electro-Tactile Interfaces
Electro-tactile interfaces are becoming increasingly popular due to their unique advantages, such as fast and localised tactile response, thin and flexible form factors, and the potential to create novel tactile experiences. However, insights from a formative study with designers highlighted a lack of resources, limited access to information, and the complexity of software and hardware tools. This establishes a high barrier to entry and limits the ability to rapidly prototype and experiment with electro-tactile interfaces. To address these challenges, we propose eTactileKit, a scalable and accessible toolkit providing end-to-end support for designing and prototyping electro-tactile interfaces. eTactileKit comprises a hardware platform and a software framework for designing, simulating, and exploring electro-tactile stimuli. We evaluated the impact and usability of eTactileKit through a three-week take-home study, which demonstrated increased accessibility, ease of use, and a positive impact on design workflow. Additionally, we implemented a set of use cases demonstrating the toolkit's practicality and effectiveness across various applications.
2025 · Praneeth Bimsara Perera et al. · Electrical Muscle Stimulation (EMS) · Prototyping & User Testing · UIST
Texergy: Textile-based Harvesting, Storing, and Releasing of Mechanical Energy for Passive On-Body Actuation
Humans instinctively manipulate and "actuate" their clothing, for instance, to adapt to the environment or to modify aesthetics. However, such manual actuation remains inflexible and directly tied to user action. We introduce Texergy, a textile-based technical framework that decouples user input from actuated output to make passive on-body actuation interactive and programmable. Texergy achieves this by harvesting energy from user interactions with a set of input modules, storing it mechanically on the body in elastic materials, later releasing the energy on demand, and finally connecting to output end-effectors that realize the actuation. We present a fabrication approach based almost entirely on textile materials, using laser cutting and simple manual assembly to enable integration into clothing and easy prototyping. We report the results of technical experiments and provide a design tool to support customizing the actuation's force and distance, the type of harvesting, and the deployment of Texergy mechanisms. We practically demonstrate the capabilities of Texergy with four applications: a quick-release belt, a passive exosuit with dynamic assistance, a haptic feedback top powered by implicit user actions in VR, and a dance-driven shape-changing costume.
2025 · Yu Jiang et al. · Force Feedback & Pseudo-Haptic Weight · Haptic Wearables · Shape-Changing Interfaces & Soft Robotic Materials · UIST
GestureCoach: Rehearsing for Engaging Talks with LLM-Driven Gesture Recommendations
This paper introduces GestureCoach, a system designed to help speakers deliver more engaging talks by guiding them to gesture effectively during rehearsal. GestureCoach combines an LLM-driven gesture recommendation model with a rehearsal interface that proactively cues speakers to gesture appropriately. Trained on experts' gesturing patterns from TED talks, the model consists of two modules: an emphasis proposal module, which predicts when to gesture by identifying gesture-worthy text segments in the presenter notes, and a gesture identification module, which determines what gesture to use by retrieving semantically appropriate gestures from a curated gesture database. Results of a model performance evaluation and user study (N=30) show that the emphasis proposal module outperforms off-the-shelf LLMs in identifying suitable gesture regions, and that participants rated the majority of these predicted regions and their corresponding gestures as highly appropriate. A subsequent user study (N=10) showed that rehearsing with GestureCoach encouraged speakers to gesture and significantly increased gesture diversity, resulting in more engaging talks. We conclude with design implications for future AI-driven rehearsal systems.
2025 · Ashwin Ram et al. · Hand Gesture Recognition · Human-LLM Collaboration · Creative Collaboration & Feedback Systems · UIST
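The gesture identification module's retrieval step can be illustrated with a minimal sketch. This is an assumption-laden stand-in: the toy bag-of-words embedding and the sample database entries below are illustrative only, not the learned sentence embeddings or curated database the paper describes.

```python
from collections import Counter
from math import sqrt

def embed(text):
    """Toy bag-of-words embedding; a real system would presumably use
    learned sentence embeddings instead (assumption)."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse word-count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve_gesture(segment, gesture_db):
    """Return the gesture whose description is most similar to the
    gesture-worthy text segment."""
    q = embed(segment)
    return max(gesture_db, key=lambda g: cosine(q, embed(g["description"])))

# Hypothetical database entries for illustration
gesture_db = [
    {"name": "open_palms", "description": "open both palms outward to welcome the audience"},
    {"name": "counting", "description": "count points on fingers one two three"},
]
best = retrieve_gesture("first, second and third point on fingers", gesture_db)
```

In this sketch, the query about enumerating points retrieves the "counting" gesture, because its description shares the most vocabulary with the segment.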
ExoKit: A Toolkit for Rapid Prototyping of Interactions for Arm-based Exoskeletons
Exoskeletons open up a unique interaction space that seamlessly integrates users' body movements with robotic actuation. Despite this potential, human-exoskeleton interaction remains an underexplored area in HCI, largely due to the lack of accessible prototyping tools that enable designers to easily develop exoskeleton designs and customized interactive behaviors. We present ExoKit, a do-it-yourself toolkit for rapid prototyping of low-fidelity, functional exoskeletons, targeted at novice roboticists. ExoKit includes modular hardware components for sensing and actuating shoulder and elbow joints, which are easy to fabricate and (re)configure for customized functionality and wearability. To simplify the programming of interactive behaviors, we propose functional abstractions that encapsulate high-level human-exoskeleton interactions. These can be accessed through ExoKit's command-line or graphical user interface, a Processing library, or microcontroller firmware, each targeted at a different experience level. Findings from implemented application cases and two usage studies demonstrate the versatility and accessibility of ExoKit for early-stage interaction design.
2025 · Marie Muehlhaus et al. (Saarland Informatics Campus, Saarland University) · Force Feedback & Pseudo-Haptic Weight · Shape-Changing Interfaces & Soft Robotic Materials · Circuit Making & Hardware Prototyping · CHI
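The idea of a functional abstraction that encapsulates a high-level human-exoskeleton interaction can be sketched as follows. All names here (`ExoJoint`, `amplify`) are hypothetical stand-ins; ExoKit's actual Processing library and firmware API differ.

```python
class ExoJoint:
    """Hypothetical model of one actuated exoskeleton joint (assumption;
    not ExoKit's real API)."""
    def __init__(self, name):
        self.name = name
        self.sensed_angle = 0.0   # degrees, read from the joint's sensor
        self.target_angle = 0.0   # degrees, commanded to the actuator

def amplify(joint, gain):
    """Functional abstraction: the actuator amplifies the user's own
    movement by a gain factor, so the designer never touches low-level
    motor commands."""
    joint.target_angle = joint.sensed_angle * gain

# A designer's entire interactive behavior could be this one call per frame
elbow = ExoJoint("elbow")
elbow.sensed_angle = 30.0   # user flexed the elbow by 30 degrees
amplify(elbow, gain=1.5)    # exoskeleton drives the joint toward 45 degrees
```

The point of the abstraction is that the behavior ("amplify my movement") is declared once, while sensing, control loops, and actuation stay encapsulated.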
3HANDS Dataset: Learning from Humans for Generating Naturalistic Handovers with Supernumerary Robotic Limbs
Supernumerary robotic limbs (SRLs) are robotic structures integrated closely with the user's body, which augment human physical capabilities and necessitate seamless, naturalistic human-machine interaction. For effective assistance in physical tasks, enabling SRLs to hand over objects to humans is crucial. Yet, designing heuristic-based policies for robots is time-consuming, difficult to generalize across tasks, and results in less human-like motion. When trained with proper datasets, generative models are powerful alternatives for creating naturalistic handover motions. We introduce 3HANDS, a novel dataset of object handover interactions between a participant performing a daily activity and another participant enacting a hip-mounted SRL in a naturalistic manner. 3HANDS captures the unique characteristics of SRL interactions: operating in intimate personal space with asymmetric object origins, implicit motion synchronization, and the user's engagement in a primary task during the handover. To demonstrate the effectiveness of our dataset, we present three models: one that generates naturalistic handover trajectories, another that determines the appropriate handover endpoints, and a third that predicts the moment to initiate a handover. In a user study (N=10), we compare handover interactions performed with our method to a baseline. The findings show that our method was perceived as significantly more natural, less physically demanding, and more comfortable.
2025 · Artin Saberpour Abadian et al. (Saarland University, Saarland Informatics Campus) · Teleoperated Driving · Human-Robot Collaboration (HRC) · CHI
Embrogami: Shape-Changing Textiles with Machine Embroidery
Machine embroidery is a versatile technique for creating custom, entirely fabric-based patterns on thin and conformable textile surfaces. However, existing machine-embroidered surfaces remain static, limiting the interactions they can support. We introduce Embrogami, an approach for fabricating textile structures with versatile shape-changing behaviors. Inspired by origami, we leverage machine embroidery to form fingertip-scale mountain-and-valley structures on textiles with customized shapes, bistable or elastic behaviors, and modular composition. The structures can be actuated by the user or the system to modify the local textile surface topology, creating interactive elements like toggles and sliders or textile shape displays with an ultra-thin, flexible, and integrated form factor. We provide a dedicated software tool, and report results of technical experiments, to allow users to flexibly design, fabricate, and deploy customized Embrogami structures. With four application cases, we showcase Embrogami's potential to create functional and flexible shape-changing textiles with diverse visuo-tactile feedback.
2024 · Yu Jiang et al. · Haptic Wearables · Shape-Changing Interfaces & Soft Robotic Materials · UIST
SoftBioMorph: Fabricating Sustainable Shape-changing Interfaces using Soft Biopolymers
Bio-based and biodegradable materials have shown promising results for sustainable Human-Computer Interaction (HCI) applications, including shape-changing interfaces. However, the diversity of shape-changing behaviors achievable with these materials remains unclear, as the fabrication knowledge is scattered across multiple research fields. This paper introduces SoftBioMorph, a fabrication framework that integrates the fabrication know-how of sustainable soft shape-changing interfaces with biopolymers. Using sodium alginate as an example, the framework contributes (1) a set of material synthesis processes that modify the biopolymer's properties to fulfill different functions; (2) a set of DIY crafting-based assembling techniques that functionalize the material and assembly properties to achieve three primitive types of shape change; and (3) a series of application cases that demonstrate the versatility of the framework. We further discuss limitations, research questions, and fabrication challenges, presenting a comprehensive approach to sustainable prototyping in HCI.
2024 · Madalina Nicolae et al. · Shape-Changing Interfaces & Soft Robotic Materials · Shape-Changing Materials & 4D Printing · Sustainable HCI · DIS
Flextiles: Designing Customisable Shape-Change in Textiles with SMA-Actuated Smocking Patterns
Shape Memory Alloys (SMAs) afford the seamless integration of shape-changing behaviour into textiles, enabling designers to augment apparel with dynamic shaping and styling. However, existing works fall short of providing versatile methods adaptable to varying scales, materials, and applications, curtailing designers' capacity to prototype customised solutions. To address this, we introduce Flextiles, parameterised SMA design schemas that leverage the traditional craft of smocking to integrate planar shape-change seamlessly into diverse textile projects. The conception of Flextiles stems from material experimentation and consultative dialogues with designers, whose insights inspired strategies for customising the scale, elasticity, geometry, and actuation of Flextiles. To support practical implementation, we provide a design tool and experimentally characterise the material properties of Flextiles. Lastly, through a design case study with practitioners, we explore the multifaceted applications and perspectives surrounding Flextiles, and realise four scenarios that illustrate the creative potential of these modular, customisable patterns.
2024 · Alice C Haynes et al. (Saarland Informatics Campus) · Haptic Wearables · Shape-Changing Interfaces & Soft Robotic Materials · Shape-Changing Materials & 4D Printing · CHI
Shaping Compliance: Inducing Haptic Illusion of Compliance in Different Shapes with Electrotactile Grains
Compliance, the degree of displacement under applied force, is pivotal in determining how a material is perceived when touching an object. Vibrotactile actuators can create grain-based virtual compliance, but they have poor spatial resolution and a limiting, rigid form factor. We propose a novel electrotactile compliance illusion that renders grains of electrical pulses on an electrode array in response to changes in finger force. We demonstrate its ability to render compliance in distinct shapes through a thin, lightweight, and flexible finger-worn interface. Detailed technical parameters and the implementation of our device are provided. A controlled experiment confirms that the technique can (1) create virtual compliance; (2) adjust the compliance magnitude with grain and electrode parameters; and (3) render compliance with specific shapes. In three example applications, we show how this illusion can enhance physical objects, elements in graphical user interfaces, and virtual reality experiences.
2024 · Arata Jingu et al. (Saarland Informatics Campus) · Vibrotactile Feedback & Skin Stimulation · Haptic Wearables · Shape-Changing Interfaces & Soft Robotic Materials · CHI
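The core idea, grains of pulses emitted as finger force changes, can be sketched in a few lines. This is a minimal illustration of grain-based compliance rendering in general: the displacement model and all parameter values are illustrative assumptions, not the paper's calibrated device parameters.

```python
def grains_for_force_change(force_delta_n, compliance_mm_per_n, grain_spacing_mm=0.5):
    """Number of electrotactile grains to fire for a change in finger force.
    Virtual displacement = force change * compliance; one grain is emitted per
    grain_spacing_mm of virtual displacement. All values are illustrative
    (assumption), not the paper's calibration."""
    displacement = force_delta_n * compliance_mm_per_n
    return int(displacement / grain_spacing_mm)

# Pressing 1 N harder into a compliant (soft) virtual surface yields
# more grains than pressing into a stiff one, evoking greater "give".
soft = grains_for_force_change(1.0, compliance_mm_per_n=2.0)
hard = grains_for_force_change(1.0, compliance_mm_per_n=0.5)
```

The perceived compliance magnitude then tracks the grain density per unit force, which is what the experiment's grain and electrode parameters modulate.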
Biohybrid Devices: Prototyping Interactive Devices with Growable Materials
Living biomaterials are increasingly used in HCI for fabricating objects by growing them. However, how to integrate electronics to make these objects interactive remains unclear. This paper presents an exploration of the fabrication design space of Biohybrid Interactive Devices, a class of interactive devices fabricated by merging electronic components and living organisms. From the exploration of this space using bacterial cellulose, we outline a fabrication framework centered on the biomaterials' life-cycle phases. We introduce a set of novel fabrication techniques for embedding conductive elements, sensors, and output components through biological (e.g., bio-fabrication and bio-assembling) and digital processes. We demonstrate the combinatory aspect of the framework by realizing three tangible, wearable, and shape-changing interfaces. Finally, we discuss the sustainability of our approach, its limitations, and the implications for biohybrid systems in HCI.
2023 · Madalina Nicolae et al. · Automated Driving Interface & Takeover Design · Shape-Changing Interfaces & Soft Robotic Materials · UIST
WRLKit: Computational Design of Personalized Wearable Robotic Limbs
Wearable robotic limbs (WRLs) augment human capabilities through robotic structures that attach to the user's body. While WRLs are intensely researched and various device designs have been presented, it remains difficult for non-roboticists to engage with this exciting field. We aim to empower interaction designers and application domain experts to explore novel designs and applications by rapidly prototyping personalized WRLs that are customized for different tasks, body locations, or users. In this paper, we present WRLKit, an interactive computational design approach that enables designers to rapidly prototype a personalized WRL without requiring extensive robotics and ergonomics expertise. The body-aware optimization approach starts by capturing the user's body dimensions and dynamic body poses. Then, an optimized fabricable structure is generated for a desired mounting location and workspace, to fit the user's body and intended task. The results of a user study and several implemented prototypes demonstrate the practical feasibility and versatility of WRLKit.
2023 · Artin Saberpour Abadian et al. · Shape-Changing Interfaces & Soft Robotic Materials · Human-Robot Collaboration (HRC) · UIST
Double-Sided Tactile Interactions for Grasping in Virtual Reality
For grasping, tactile stimuli to multiple fingertips are crucial for realistic shape rendering and precise manipulation. Pinching is particularly important in virtual reality, since it is frequently used to grasp virtual objects. However, the interaction space of tactile feedback around pinching is underexplored due to a lack of means to provide co-located but different stimulation to the finger pads. We propose a double-sided electrotactile device with a thin and flexible form factor that fits within pinched finger pads, comprising two overlapping 3 × 3 electrode arrays. Using this tactile interface, we define a new concept of double-sided tactile interactions with three feedback modes: (1) single-sided stimulation, (2) simultaneous double-sided stimulation, and (3) spatiotemporal double-sided stimulation. Through two user studies, we (1) demonstrate that participants can accurately discriminate between single-sided and double-sided stimulation and find a qualitative difference in tactile sensation; and (2) confirm the occurrence of apparent tactile motion between fingers and present optimal parameters for continuous or discrete movements. Based on these findings, we demonstrate five VR applications exemplifying how double-sided tactile interactions can produce spatiotemporal movement of a virtual object between fingers and enrich touch feedback for UI operation.
2023 · Arata Jingu et al. · Mid-Air Haptics (Ultrasonic) · Vibrotactile Feedback & Skin Stimulation · Immersion & Presence Research · UIST
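The spatiotemporal mode rests on apparent tactile motion, which is conventionally produced by activating stimulation sites sequentially with a fixed stimulus onset asynchrony (SOA). The sketch below shows such a schedule; the SOA and duration values, and the electrode names, are illustrative assumptions, not the optimal parameters the study reports.

```python
def apparent_motion_schedule(electrodes, soa_ms=60.0, duration_ms=100.0):
    """Return (electrode, onset_ms, offset_ms) tuples that activate
    electrodes sequentially with a fixed stimulus onset asynchrony,
    a standard recipe for apparent tactile motion. Parameter values
    are illustrative (assumption)."""
    return [(e, i * soa_ms, i * soa_ms + duration_ms)
            for i, e in enumerate(electrodes)]

# Hypothetical path from the thumb-side array to the index-side array
# of the pinched-fingerpad device
schedule = apparent_motion_schedule(["thumb_1", "thumb_2", "index_1"])
```

Because consecutive activations overlap in time (SOA shorter than duration), the sensation is perceived as one stimulus moving across, and between, the finger pads rather than as discrete taps.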
Handheld Tools Unleashed: Mixed-Initiative Physical Sketching with a Robotic Printer
Personal fabrication has mostly focused on handheld tools as embodied extensions of the user, and on machines like laser cutters and 3D printers that automate parts of the process without intervention. Although interactive digital fabrication has been explored as a middle ground, existing systems have a fixed allocation of user intervention vs. machine autonomy, limiting flexibility, creativity, and improvisation. We explore a new class of devices that combine the desirable properties of a handheld tool and an autonomous fabrication robot, offering a continuum from manual and assisted to autonomous fabrication, with seamless mode transitions. We exemplify the concept of mixed-initiative physical sketching with RoboSketch, a working robotic printer that can be handheld for free-hand sketching, can provide interactive assistance during sketching, or can move about for computer-generated sketches. We present interaction techniques to seamlessly transition between modes, and sketching techniques benefitting from these transitions to, e.g., extend (upscale, repeat) or revisit (refine, color) sketches. Our evaluation with seven sketchers illustrates that RoboSketch successfully leverages each mode's strengths, and that mixed-initiative physical sketching makes computer-supported sketching more flexible.
2023 · Narjes Pourjafarian et al. (Saarland University, Saarland Informatics Campus) · Desktop 3D Printing & Personal Fabrication · Laser Cutting & Digital Fabrication · Shape-Changing Materials & 4D Printing · CHI
I Need a Third Arm! Eliciting Body-based Interactions with a Wearable Robotic Arm
Wearable robotic arms (WRAs) open up a unique interaction space that closely integrates the user's body with an embodied robotic collaborator. This space affords diverse interaction styles, including body movement, hand gestures, or gaze. Yet, which commands are desirable from a user perspective has so far been unexplored. Contributing findings from an elicitation study (N=14), we provide a comprehensive set of interactions for basic robot control, navigation, object manipulation, and emergency situations, performed when hands are free or occupied. Our study provides insights into preferred body parts, input modalities, and the users' underlying sources of inspiration. Comparing interaction styles between WRAs and off-body robots, we highlight how WRAs enable a range of interactions specific to on-body robots and how users treat WRAs both as tools and as collaborators. We conclude by providing guidance, informed by user behavior, on the design of ad-hoc interaction with WRAs.
2023 · Marie Muehlhaus et al. (Saarland Informatics Campus) · Shape-Changing Interfaces & Soft Robotic Materials · Human-Robot Collaboration (HRC) · CHI
Haptic Servos: Self-Contained Vibrotactile Rendering System for Creating or Augmenting Material Experiences
When vibrations are synchronized with our actions, we experience them as material properties. This effect has been used to create virtual experiences like friction, counter-force, compliance, or torsion. Implementing such experiences is non-trivial, requiring high temporal resolution in sensing, high-fidelity tactile output, and low latency. To make this style of haptic feedback more accessible to non-domain experts, we present Haptic Servos: self-contained haptic rendering devices that encapsulate all timing-critical elements. We characterize Haptic Servos' real-time performance, showing that the system latency is below 5 ms. We explore the subjective experiences they can evoke, highlighting that qualitatively distinct experiences can be created based on the input mapping alone, even if the stimulation parameters and algorithm remain unchanged. A workshop demonstrated that users new to Haptic Servos require approximately ten minutes to set up a basic haptic rendering system. Haptic Servos are open source; we invite others to copy and modify our design.
2023 · Nihar Sabnis et al. (Max Planck Institute for Informatics, Saarland Informatics Campus) · Vibrotactile Feedback & Skin Stimulation · Force Feedback & Pseudo-Haptic Weight · CHI
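Action-synchronized vibration of this kind is often rendered as a detent or grain pattern: fire a short tactile pulse each time the sensed movement crosses a position boundary. The sketch below illustrates that loop in the abstract; the class, its parameters, and the detent spacing are hypothetical, not the Haptic Servos firmware, which runs this logic on-device to keep latency under 5 ms.

```python
class DetentRenderer:
    """Sketch of action-synchronized haptic rendering (assumption: a
    detent/grain-style mapping; not the actual Haptic Servos firmware).
    Fires one pulse per detent boundary the sensed position crosses."""
    def __init__(self, detent_deg=10.0):
        self.detent = detent_deg
        self.last_index = 0

    def update(self, angle_deg):
        """Given a new sensor reading, return how many pulses to fire."""
        index = int(angle_deg // self.detent)
        pulses = abs(index - self.last_index)
        self.last_index = index
        return pulses

r = DetentRenderer(detent_deg=10.0)
burst = r.update(25.0)   # crossed the 10-degree and 20-degree boundaries
```

Because pulses occur only when the user moves, the vibration is perceived as a property of the manipulated object (e.g., a notched dial) rather than as an external buzz, which is exactly why the input mapping alone can change the evoked experience.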
Prototyping Soft Devices with Interactive Bioplastics
Designers and makers are increasingly interested in leveraging bio-based and biodegradable 'do-it-yourself' (DIY) materials for sustainable prototyping. Their self-produced bioplastics possess compelling properties such as self-adhesion but have so far not been functionalized to create soft interactive devices, due to a lack of DIY techniques for fabricating functional electronic circuits and sensors. In this paper, we contribute a DIY approach for creating Interactive Bioplastics that is accessible to a wide audience, making use of easy-to-obtain bio-based raw materials and familiar tools. We present three types of conductive bioplastic materials and their formulation: sheets, pastes, and foams. Our materials enable additive and subtractive fabrication of soft circuits and sensors. Furthermore, we demonstrate how these materials can substitute conventional prototyping materials, be combined with off-the-shelf electronics, and be fed into a sustainable material 'life cycle' including disassembly, re-use, and re-melting. A formal characterization of our conductors highlights that they are on par with commercially available carbon-based conductive pastes.
2022 · Marion Koelle et al. · Shape-Changing Materials & 4D Printing · Circuit Making & Hardware Prototyping · Sustainable HCI · UIST
Next Steps for Epidermal Computing: Opportunities and Challenges for Soft On-Skin Devices
Skin is a promising interaction medium and has been widely explored for mobile and expressive interaction. Recent research in HCI has seen the development of Epidermal Computing Devices: ultra-thin and non-invasive devices that reside on the user's skin, offering intimate integration with the curved surfaces of the body while having physical and mechanical properties akin to skin, expanding the horizon of on-body interaction. However, with rapid technological advancements across multiple disciplines, we see a need to synthesize the main open research questions and opportunities for the HCI community to advance future research in this area. By systematically analyzing Epidermal Devices contributed in the HCI community and in physical sciences research, and drawing on our experiences in designing and building Epidermal Devices, we identify opportunities and challenges for advancing research across five themes. This multi-disciplinary synthesis enables multiple research communities to progress toward more coordinated endeavors in advancing Epidermal Computing.
2022 · Aditya Shekhar Nittala et al. (Saarland Informatics Campus, University of Calgary) · Haptic Wearables · On-Skin Display & On-Skin Input · CHI
Print-A-Sketch: A Handheld Printer for Physical Sketching of Circuits and Sensors on Everyday Surfaces
We present Print-A-Sketch, an open-source handheld printer prototype for sketching circuits and sensors. Print-A-Sketch combines desirable properties of free-hand sketching and functional electronic printing: manual human control of large strokes is augmented with computer control of fine detail. Shared control of Print-A-Sketch supports sketching interactive interfaces on everyday objects, including many objects whose materials or sizes otherwise make them difficult to print on. We present an overview of the challenges involved in such a system and show how they can be addressed using context-aware, dynamic printing. Continuous sensing ensures quality prints by adjusting the inking rate to hand movement and material properties; it also enables the print to adapt to previously printed traces, supporting incremental and iterative sketching. Results show good conductivity on many materials and high spatial precision, supporting on-the-fly creation of functional interfaces.
2022 · Narjes Pourjafarian et al. (Saarland University, Saarland Informatics Campus) · Circuit Making & Hardware Prototyping · Customizable & Personalized Objects · CHI
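The speed-adaptive inking idea generalizes to a one-line control law: scale the nozzle firing frequency with measured hand speed so that ink deposition per millimetre stays constant. The sketch below illustrates this; the function name, drop density, and frequency cap are illustrative assumptions, not Print-A-Sketch's actual calibration.

```python
def firing_frequency_hz(hand_speed_mm_s, drops_per_mm=5.0, max_hz=1000.0):
    """Context-aware inking sketch: firing frequency proportional to hand
    speed keeps deposition per millimetre constant, clamped to the print
    head's maximum rate. Values are illustrative (assumption)."""
    return min(hand_speed_mm_s * drops_per_mm, max_hz)

# Moving the printer four times faster fires drops four times as often,
# so line width and conductivity stay uniform along the stroke.
slow = firing_frequency_hz(20.0)
fast = firing_frequency_hz(80.0)
```

Above the cap, the trace would thin out; a real system could additionally vary drop density per material, which is where the sensed material properties come in.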