Motion-Coupled Asymmetric Vibration for Pseudo Force Rendering in Virtual Reality

In Virtual Reality (VR), rendering realistic forces is crucial for immersion, but traditional vibrotactile feedback fails to convey force sensations effectively. Studies of asymmetric vibrations that elicit pseudo forces show promise but are inherently tied to unwanted vibrations, reducing realism. Leveraging sensory attenuation to reduce the perceived intensity of self-generated vibrations during user movement, we present a novel algorithm that couples asymmetric vibrations with user motion, mimicking self-generated sensations. Our psychophysics study with 12 participants shows that motion-coupled asymmetric vibration attenuates the experience of vibration (equivalent to a ~30% reduction in vibration amplitude) while preserving the experience of force, compared to continuous asymmetric vibrations (the state of the art). We demonstrate the effectiveness of our approach in VR through three scenarios: shooting arrows, lifting weights, and simulating haptic magnets. Results revealed that participants preferred forces elicited by motion-coupled asymmetric vibration for tasks like shooting arrows and lifting weights. This research highlights the potential of motion-coupled asymmetric vibrations, offers new insights into sensory attenuation, and advances force rendering in VR.

2025 · Nihar Sabnis et al. · Max Planck Institute for Informatics, Saarland Informatics Campus, Sensorimotor Interaction · Force Feedback & Pseudo-Haptic Weight · CHI

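The motion-coupling idea in the entry above can be sketched as follows. This is an illustrative toy, not the authors' published algorithm: the waveform shape, the speed threshold, and the mapping from hand speed to amplitude are all assumptions made for the example. The core idea is that an asymmetric pulse (fast rise, slow fall) elicits a pseudo force, and gating its amplitude by the user's own movement makes the residual vibration feel self-generated and thus attenuated.

```python
import numpy as np

def asymmetric_pulse(n_samples=200, sharp_fraction=0.1):
    """One asymmetric vibration cycle: fast rise, slow fall.
    The skewed acceleration profile is what elicits a pseudo force."""
    n_sharp = max(1, int(n_samples * sharp_fraction))
    rise = np.linspace(-1.0, 1.0, n_sharp)          # steep segment
    fall = np.linspace(1.0, -1.0, n_samples - n_sharp)  # gentle return
    return np.concatenate([rise, fall])

def motion_gate(speed, threshold=0.05, full_speed=0.5):
    """Scale vibration amplitude with hand speed (m/s) so pulses only
    play while the user moves; threshold and full_speed are
    illustrative values, not calibrated ones."""
    if speed < threshold:
        return 0.0
    return min(1.0, (speed - threshold) / (full_speed - threshold))

def render(speed):
    """Motion-coupled output: the asymmetric pulse, gated by motion."""
    return motion_gate(speed) * asymmetric_pulse()
```

When the hand is still, `render` outputs silence; at speed, the full asymmetric cycle plays, so the vibration is always coupled to the user's own action.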
Who did it? How User Agency is Influenced by Visual Properties of Generated Images

The increasing proliferation of AI and GenAI requires new interfaces tailored to where their specific affordances and human requirements meet. As GenAI is capable of taking over tasks from users on an unprecedented scale, designing the experience of agency -- if and how users experience control over the process and responsibility over the outcome -- is crucial. As an initial step towards design guidelines for shaping agency, we present a study that explores how features of AI-generated images influence users' experience of agency. We use two measures: temporal binding, to implicitly estimate pre-reflective agency, and magnitude estimation, to assess user judgments of agency. We observe that abstract images lead to more temporal binding than images with semantic meaning. In contrast, the closer an image aligns with what a user might expect, the higher the agency judgment. When comparing the experiment results with objective metrics of image differences, we find that temporal binding results correlate with semantic differences, while agency judgments are better explained by local differences between images. This work contributes towards a future where agency is considered an important design dimension for GenAI interfaces.

2024 · Johanna K. Didion et al. · Generative AI (Text, Image, Music, Video) · Explainable AI (XAI) · UIST

vARitouch: Back of the Finger Device for Adding Variable Compliance to Rigid Objects

We present vARitouch, a back-of-the-finger wearable that can modify the perceived tactile material properties of the uninstrumented world around us: vARitouch can modulate the perceived softness of a rigid object through a vibrotactile compliance illusion. As vARitouch does not cover the fingertip, all natural tactile properties are preserved. We provide three contributions: (1) We demonstrate the feasibility of the concept through a psychophysics study, showing that virtual compliance can be continuously modulated, and perceived softness can be increased by approximately 30 Shore A levels. (2) A qualitative study indicates the desirability of such a device, showing that a back-of-the-finger haptic device has many attractive qualities. (3) To implement vARitouch, we identify a novel way to measure pressure from the back of the finger by repurposing a pulse oximetry sensor. Based on these contributions, we present the finalized vARitouch system, accompanied by a series of application scenarios.

2024 · Gabriela Vega et al. · Max Planck Institute for Informatics, Saarland Informatics Campus · Vibrotactile Feedback & Skin Stimulation · Haptic Wearables · CHI

Shaping Compliance: Inducing Haptic Illusion of Compliance in Different Shapes with Electrotactile Grains

Compliance, the degree of displacement under applied force, is pivotal in determining material perception when touching an object. Vibrotactile actuators can be used for creating grain-based virtual compliance, but they have poor spatial resolution and a limiting rigid form factor. We propose a novel electrotactile compliance illusion that renders grains of electrical pulses on an electrode array in response to finger force changes. We demonstrate its ability to render compliance in distinct shapes through a thin, lightweight, and flexible finger-worn interface. Detailed technical parameters and the implementation of our device are provided. A controlled experiment confirms the technique can (1) create virtual compliance; (2) adjust the compliance magnitude with grain and electrode parameters; and (3) render compliance with specific shapes. In three example applications, we present how this illusion can enhance physical objects, elements in graphical user interfaces, and virtual reality experiences.

2024 · Arata Jingu et al. · Saarland Informatics Campus · Vibrotactile Feedback & Skin Stimulation · Haptic Wearables · Shape-Changing Interfaces & Soft Robotic Materials · CHI

Tactile Symbols with Continuous and Motion-Coupled Vibration: An Exploration of using Embodied Experiences for Hermeneutic Design

With most digital devices, vibrotactile feedback consists of rhythmic patterns of continuous vibration. In contrast, when interacting with physical objects, we experience many of their material properties through vibration which is not continuous, but dynamically coupled to our actions. We assume the first style of vibration to lead to hermeneutic mediation, while the second style leads to embodied mediation. What if both types of mediation could be used to design tactile symbols? To investigate this, five haptic experts designed tactile symbols using continuous and motion-coupled vibration. Experts were interviewed to understand their symbols and design approach. A thematic analysis revealed themes showing that lived experience and affective qualities shaped design choices, that participants optimized for passive or active symbols, and that participants considered context as part of the design. Our study suggests that adding embodied experiences as a design resource changes how participants think about tactile symbol design: it broadens the scope of the symbol by encouraging design for context, and expands the affective repertoire, as changing the type of vibration influences perceived valence and arousal.

2023 · Nihar Sabnis et al. · Max Planck Institute for Informatics, Saarland Informatics Campus · Vibrotactile Feedback & Skin Stimulation · Shape-Changing Interfaces & Soft Robotic Materials · Hand Gesture Recognition · CHI

Negotiating Experience and Communicating Information Through Abstract Metaphor

An implicit assumption in metaphor use is that it requires grounding in a familiar concept, prominently seen in the popular Desktop Metaphor. In human-to-human communication, however, abstract metaphors, without such grounding, are often used with great success. To understand when and why metaphors work, we present a case study of metaphor use in voice teaching. Voice educators must teach about subjective, sensory experiences and rely on abstract metaphor to express information about unseen and intangible processes inside the body. We present a thematic analysis of metaphor use by 12 voice teachers. We found that metaphor works not because of strong grounding in the familiar, but because of its ambiguity and flexibility, allowing shared understanding between individual lived experiences. We summarise our findings in a model of metaphor-based communication. This model can be used as an analysis tool within the existing taxonomies of metaphor in user interaction for better understanding why metaphor works in HCI. It can also be used as a design resource for thinking about metaphor use and abstracting metaphor strategies from both novel and existing designs.

2023 · Courtney N. Reed et al. · Max Planck Institute for Informatics, Saarland Informatics Campus, Queen Mary University of London · User Research Methods (Interviews, Surveys, Observation) · Interactive Narrative & Immersive Storytelling · CHI

Haptic Servos: Self-Contained Vibrotactile Rendering System for Creating or Augmenting Material Experiences

When vibrations are synchronized with our actions, we experience them as material properties. This has been used to create virtual experiences like friction, counter-force, compliance, or torsion. Implementing such experiences is non-trivial, requiring high temporal resolution in sensing, high-fidelity tactile output, and low latency. To make this style of haptic feedback more accessible to non-domain experts, we present Haptic Servos: self-contained haptic rendering devices which encapsulate all timing-critical elements. We characterize Haptic Servos' real-time performance, showing that system latency is < 5 ms. We explore the subjective experiences they can evoke, highlighting that qualitatively distinct experiences can be created based on input mapping, even if stimulation parameters and algorithm remain unchanged. A workshop demonstrated that users new to Haptic Servos require approximately ten minutes to set up a basic haptic rendering system. Haptic Servos are open source; we invite others to copy and modify our design.

2023 · Nihar Sabnis et al. · Max Planck Institute for Informatics, Saarland Informatics Campus · Vibrotactile Feedback & Skin Stimulation · Force Feedback & Pseudo-Haptic Weight · CHI

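The "vibrations synchronized with our actions" idea behind this style of rendering can be sketched as a simple grain renderer: a short vibration burst is triggered each time the sensed input crosses a grid step, so output is strictly a function of user action. This is a minimal illustration of the general technique, with made-up parameters; it is not Haptic Servos' actual firmware or API.

```python
class GrainRenderer:
    """Trigger a short vibration 'grain' each time the sensed input
    (e.g. position or force) crosses a step boundary, so vibration
    is tightly coupled to the user's own movement.
    Step size is an illustrative parameter, not a calibrated value."""

    def __init__(self, step=1.0):
        self.step = step
        self.last_bin = None  # no reading seen yet

    def update(self, position):
        """Feed one sensor sample; return how many grains to play."""
        b = int(position // self.step)
        if self.last_bin is None:
            self.last_bin = b
            return 0
        grains = abs(b - self.last_bin)  # one grain per crossed step
        self.last_bin = b
        return grains
```

Because grains fire only on input change, the same renderer produces different subjective experiences depending on what sensor signal is mapped to `update`, mirroring the abstract's point that input mapping alone can change the experience.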
Print-A-Sketch: A Handheld Printer for Physical Sketching of Circuits and Sensors on Everyday Surfaces

We present Print-A-Sketch, an open-source handheld printer prototype for sketching circuits and sensors. Print-A-Sketch combines desirable properties from free-hand sketching and functional electronic printing. Manual human control of large strokes is augmented with computer control of fine detail. Shared control of Print-A-Sketch supports sketching interactive interfaces on everyday objects -- including many objects with materials or sizes which otherwise are difficult to print on. We present an overview of challenges involved in such a system and show how these can be addressed using context-aware, dynamic printing. Continuous sensing ensures quality prints by adjusting inking rate to hand movement and material properties. Continuous sensing also enables the print to adapt to previously printed traces to support incremental and iterative sketching. Results show good conductivity on many materials and high spatial precision, supporting on-the-fly creation of functional interfaces.

2022 · Narjes Pourjafarian et al. · Saarland University, Saarland Informatics Campus · Circuit Making & Hardware Prototyping · Customizable & Personalized Objects · CHI

BodyStylus: Freehand On-Skin Design and Fabrication of Epidermal Interfaces

In traditional body-art, designs are adjusted to the body as they are applied, enabling creative improvisation and exploration. Conventional design and fabrication methods of epidermal interfaces, however, separate these steps. With BodyStylus we present the first computer-assisted approach for on-skin design and fabrication of epidermal interfaces. Inspired by traditional techniques, we propose a hand-held tool that augments freehand inking with digital support: projected in-situ guidance assists creating valid on-body circuits and aesthetic ornaments that align with the human bodyscape, while proactive switching between inking and non-inking creates error-preventing constraints. We contribute BodyStylus' design rationale and interaction concept along with an interactive prototype that uses self-sintering conductive ink. Results of two focus group explorations showed that guidance was more appreciated by artists, while constraints appeared more useful to engineers, and that working on the body inspired critical reflection on the relationship between bodyscape, interaction, and designs.

2021 · Narjes Pourjafarian et al. · Saarland University, Saarland Informatics Campus · Haptic Wearables · Shape-Changing Interfaces & Soft Robotic Materials · On-Skin Display & On-Skin Input · CHI

Squish This: Force Input on Soft Surfaces for Visual Targeting Tasks

Today's typical input device is flat, rigid and made of glass. However, advances in sensing technology and interaction design suggest thinking about input on other surfaces, including soft materials. While touching rigid and soft materials might feel similar, they clearly feel different when pressure is applied to them. Yet, to date, studies have only investigated force input on rigid surfaces. We present a first systematic evaluation of the effects of compliance on force input. Results of a visual targeting task for three levels of softness indicate that high force levels appear more demanding for soft surfaces, but that performance is otherwise similar. Performance remained very high across 20 force levels regardless of the compliance, suggesting force input has been underestimated so far. We infer implications for the design of force input on soft surfaces and conclude that interaction models used on rigid surfaces might be used on soft surfaces.

2021 · Bruno Fruchard et al. · Saarland Informatics Campus · Shape-Changing Interfaces & Soft Robotic Materials · CHI

Eyecam: Revealing Relations between Humans and Sensing Devices through an Anthropomorphic Webcam

We are surrounded by sensing devices. We are accustomed to them, appreciate their benefits, and even form affective bonds with them, yet we might neglect the implications they have for our daily life. By presenting Eyecam, an anthropomorphic webcam mimicking a human eye, we challenge conventional relationships with ubiquitous sensing devices and call to re-think how sensing devices might appear and behave. Inspired by critical design, Eyecam is an exaggeration of a familiar sensing device which allows for critical reflections on its perceived functionalities and its impact on human-human and human-device relations. We identify 5 different roles Eyecam can take: Mediator, Observer, Mirror, Presence, and Agent. Contributing design fictions and thinking prompts, we enable articulation of privacy awareness and intrusion, affect in mediated communication, and agency and self-perception, along with speculation on potential futures. We envision this work to contribute to a bold and responsible design of ubiquitous sensing devices.

2021 · Marc Teyssier et al. · Saarland Informatics Campus · Technology Ethics & Critical HCI · Participatory Design · CHI

bARefoot: Generating Virtual Materials using Motion Coupled Vibration in Shoes

Many features of materials can be experienced through tactile cues, even using one's feet. For example, one can easily distinguish between moss and stone without looking at the ground. However, this type of material experience is largely not supported in AR and VR applications. We present bARefoot, a prototype shoe providing tactile impulses tightly coupled to motor actions. This enables generating virtual material experiences such as compliance, elasticity, or friction. To explore the parameter space of such sensorimotor coupled vibrations, we present a design tool enabling rapid design of virtual materials. We report initial explorations to increase understanding of how parameters can be optimized for generating compliance, and to examine the effect of dynamic parameters on material experiences. Finally, we present a series of use cases that demonstrate the potential of bARefoot for VR and AR.

2020 · Paul Strohmeier et al. · Vibrotactile Feedback & Skin Stimulation · Foot & Wrist Interaction · UIST

PolySense: Augmenting Textiles with Electrical Functionality using In-Situ Polymerization

We present a method for enabling arbitrary textiles to sense pressure and deformation: In-situ polymerization supports integration of piezoresistive properties at the material level, preserving a textile's haptic and mechanical characteristics. We demonstrate how to enhance a wide set of fabrics and yarns using only readily available tools. To further support customisation by the designer, we present methods for patterning, as needed to create circuits and sensors, and demonstrate how to combine areas of different conductance in one material. Technical evaluation results demonstrate the performance of sensors created using our method is comparable to off-the-shelf piezoresistive textiles. As application examples, we demonstrate rapid manufacturing of on-body interfaces, tie-dyed motion-capture clothing, and zippers that act as potentiometers.

2020 · Cedric Honnet et al. · Massachusetts Institute of Technology · Shape-Changing Interfaces & Soft Robotic Materials · Electronic Textiles (E-textiles) · CHI
