Vestibular Stimulation Enhances Hand Redirection

We demonstrate how the vestibular system (i.e., the sense of balance) influences the perception of hand position in VR. By exploiting this via galvanic vestibular stimulation (GVS), we can enhance the degree to which we can redirect the user’s hands in VR without them noticing, i.e., raising the detection threshold of hand redirection. Our novel cross-modal illusion relies on the principle that a GVS-induced subtle body sway aligns with the user’s expected body balance during hand redirection. This alignment reduces the sensory conflict between the expected and actual body balance, allowing for a larger hand redirection than would normally be noticed. In our user study, we validated that our approach raises the detection threshold of VR hand redirection by approximately 55% for outward and 45% for inward movements. With this increase, our approach broadens the applicability of hand redirection (e.g., compressing a VR space into an even smaller physical area).

2025 · Kensuke Katori et al. · Force Feedback & Pseudo-Haptic Weight; Shape-Changing Interfaces & Soft Robotic Materials; Full-Body Interaction & Embodied Input · UIST
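Detection thresholds like these are commonly estimated by fitting a psychometric function to yes/no detection responses across redirection magnitudes. Below is a minimal, self-contained sketch of that kind of analysis (a grid-search maximum-likelihood fit of a logistic curve over made-up responses; it is not the authors' actual pipeline or data):

```python
import math

def logistic(x, mu, sigma):
    # Probability of detecting the redirection at magnitude x; mu is the 50% threshold.
    return 1.0 / (1.0 + math.exp(-(x - mu) / sigma))

def fit_threshold(offsets, detected, mus, sigmas):
    # Grid-search maximum-likelihood fit; returns the best-fitting (mu, sigma).
    best, best_ll = None, -math.inf
    for mu in mus:
        for sigma in sigmas:
            ll = 0.0
            for x, d in zip(offsets, detected):
                p = min(max(logistic(x, mu, sigma), 1e-9), 1.0 - 1e-9)
                ll += math.log(p) if d else math.log(1.0 - p)
            if ll > best_ll:
                best_ll, best = ll, (mu, sigma)
    return best

# Made-up detection data: 1 = participant noticed the hand redirection.
offsets  = [0.05, 0.10, 0.15, 0.20, 0.25, 0.30]   # redirection magnitude
detected = [0,    0,    0,    1,    1,    1]
mu, sigma = fit_threshold(offsets, detected,
                          mus=[i / 100 for i in range(1, 40)],
                          sigmas=[0.01, 0.02, 0.05, 0.10])
```

Under this toy data, the fitted threshold mu lands between the last undetected and first detected magnitude; raising the detection threshold (as GVS does here) corresponds to shifting mu upward.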
Primed Action: Preserving Agency while Accelerating Reaction Time via Subthreshold Brain Stimulation

While prior work in neuroscience confirmed that transcranial magnetic stimulation (TMS) can shorten the onset of muscle activity, the implications of this reaction-time speedup have not been explored in interactive systems. We present Primed Action, a novel interface concept that leverages this type of TMS-based faster reactions. What sets Primed Action apart from prior work that uses muscle-stimulation to “force” faster reactions is that our approach operates below the threshold of movement—it does not trigger involuntary motion, but instead “primes” neurons in the motor cortex by enhancing their neural excitability. As we found in our study, Primed Action preserved participants’ sense of agency better than existing interactive approaches based on muscle stimulation (e.g., Preemptive Action). We believe this novel insight allows HCI researchers to implement new forms of haptic assistance that do not sacrifice agency, which we demonstrate in a set of interactive experiences (e.g., VR sports training).

2025 · Yudai Tanaka et al. · Vibrotactile Feedback & Skin Stimulation; Brain-Computer Interface (BCI) & Neurofeedback; VR Medical Training & Rehabilitation · UIST
VR Side-Effects: Memory & Proprioceptive Discrepancies After Leaving Virtual Reality

Our brain’s plasticity rapidly adapts our senses in VR, a phenomenon leveraged by techniques such as redirected-walking, hand-redirection, etc. However, while most of HCI is interested in how users adapt to VR, we turn our attention to how users need to re-adapt their senses when returning to the real world. We report cases where, even after leaving VR, users experience unintended, lingering side-effects: distortions in proprioception or memory that may pose safety or usability risks. To investigate, we conducted two studies examining (1) proprioceptive side-effects from altered hand movements (retargeting), and (2) memory distortions arising from spatial mismatches between the virtual and real-world locations of the same object. We found that, after leaving VR, (1) participants’ hands remained redirected by up to 7 cm, indicating residual proprioceptive distortion; and (2) participants incorrectly recalled the virtual location of objects rather than their actual real-world locations (e.g., remembering the location of a VR-extinguisher, even when trying to recall the real one). Finally, we discuss the implications of these findings for VR and propose a call-to-action for a deeper study of these side-effects within HCI.

2025 · Antonin Cheymol et al. · Full-Body Interaction & Embodied Input; Immersion & Presence Research · UIST
ProtoPCB: Reclaiming Printed Circuit Board E-waste as Prototyping Material

We propose an interactive tool that enables reusing printed circuit boards (PCBs) as prototyping materials to implement new circuits — this extends the utility of PCBs rather than discarding them as e-waste. To enable this, our tool takes a user’s desired circuit schematic and analyzes its components and connections to find methods of creating the user’s circuit on discarded PCBs (e.g., e-waste, old prototypes). In our technical evaluation, we utilized our tool across a diverse set of PCBs and input circuits to characterize how often circuits could be implemented on a different board, implemented with minor interventions (trace-cutting or bodge-wiring), or implemented on a combination of multiple boards — demonstrating how our tool assists with exhaustive matching tasks that a user would not likely perform manually. We believe our tool offers: (1) a new approach to prototyping with electronics beyond the limitations of breadboards; and (2) a new approach to reducing e-waste during electronics prototyping.

2025 · Jasmine Lu et al. · University of Chicago · Circuit Making & Hardware Prototyping; Sustainable HCI · CHI
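The exhaustive matching described above can be sketched, in highly simplified form, as a search over donor boards whose pooled component inventories cover the parts a target circuit needs. This toy version matches component types only, ignoring connectivity, trace-cutting, and bodge-wiring; all board and part names are hypothetical:

```python
from collections import Counter
from itertools import combinations

def match_boards(needed_parts, boards):
    # Return the smallest combination of donor boards whose pooled
    # components cover every part the target circuit needs, else None.
    needed = Counter(needed_parts)
    names = list(boards)
    for r in range(1, len(names) + 1):          # try 1 board, then 2, ...
        for combo in combinations(names, r):
            pooled = Counter()
            for board in combo:
                pooled.update(boards[board])
            if all(pooled[part] >= qty for part, qty in needed.items()):
                return list(combo)
    return None

# Hypothetical e-waste inventory, keyed by donor board.
boards = {
    "old_router": ["ESP32", "LDO_3V3", "LED", "LED"],
    "dead_mouse": ["ATmega32U4", "LED"],
}
# Needs three LEDs, so no single board suffices; two boards combined do.
result = match_boards(["ESP32", "LED", "LED", "LED"], boards)
```

The real tool additionally reasons about which pads and traces can be repurposed; this sketch only illustrates why the combinatorial part of the search is tedious to do by hand.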
Seeing with the Hands: A Sensory Substitution That Supports Manual Interactions

Sensory-substitution devices enable perceiving objects by translating one modality (e.g., vision) into another (e.g., tactile). While many works have explored the placement of the haptic-output (e.g., torso, forehead), the camera’s location remains largely unexplored—devices typically see from the eyes’ perspective. Instead, we propose that seeing & feeling information from the hands’ perspective could enhance the flexibility & expressivity of sensory-substitution devices to support manual interactions with physical objects. To this end, we engineered a back-of-the-hand electrotactile-display that renders tactile images from a wrist-mounted camera, allowing the user’s hand to feel objects while reaching & hovering. We conducted a study with sighted and Blind-or-Low-Vision participants who used our eyes vs. hands tactile-perspectives to manipulate bottles, soldering-irons, etc. We found that while both tactile perspectives provided comparable performance, when offered the opportunity to choose, all participants found value in also using the hands’ perspective. Moreover, we observed behaviors when “seeing with the hands” that suggest a more ergonomic object-manipulation. We believe these insights extend the landscape of sensory-substitution devices.

2025 · Shan-Yuan Teng et al. · University of Chicago · In-Vehicle Haptic, Audio & Multimodal Feedback; Vibrotactile Feedback & Skin Stimulation; Visual Impairment Technologies (Screen Readers, Tactile Graphics, Braille) · CHI
Adaptive Electrical Muscle Stimulation Improves Muscle Memory

Electrical muscle stimulation (EMS) has been leveraged to assist in learning motor skills by actuating the user’s muscles. However, existing systems provide a static demonstration—actuating the correct movements regardless of the user’s learning progress. Instead, we contrast two versions of a piano-tutoring system: a conventional EMS setup that moves the participant’s fingers to play the sequence of movements correctly, and a novel adaptive-EMS system that changes its guidance strategy based on the participant’s performance. The adaptive-EMS dynamically adjusts its guidance: (1) demonstrate, by playing the entire sequence, when errors are frequent; (2) correct, by lifting incorrect fingers and actuating the correct one, when errors are moderate; and (3) warn, by lifting incorrect fingers, when errors are low. We found that adaptive-EMS improved learning outcomes (recall) and was preferred by participants. We believe this approach could inspire new types of physical tutoring systems that promote adaptive over static guidance.

2025 · Siya Choudhary et al. · University of Chicago · Electrical Muscle Stimulation (EMS); Motor Impairment Assistive Input Technologies · CHI
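The three-tier guidance policy can be read as a simple function of the learner's recent error rate. A sketch with hypothetical cutoff values (the abstract does not state the exact thresholds used):

```python
def guidance_mode(error_rate, demo_cutoff=0.5, correct_cutoff=0.2):
    # Map the learner's recent error rate to one of the three guidance tiers.
    # The cutoff values are illustrative, not taken from the paper.
    if error_rate >= demo_cutoff:
        return "demonstrate"   # EMS plays the entire sequence
    if error_rate >= correct_cutoff:
        return "correct"       # lift incorrect fingers, actuate the correct one
    return "warn"              # only lift incorrect fingers

mode = guidance_mode(error_rate=0.3)   # a moderately struggling learner
```

As the learner improves and the error rate drops, the system automatically hands back more control, moving from full demonstration toward mere warnings.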
Power-on-Touch: Powering Actuators, Sensors, and Devices during Interaction

We introduce Power-on-Touch, a novel method for powering devices during interaction. Power-on-Touch comprises two main components: (1) a wearable-transmitter attached to the user’s body (e.g., fingernail, back of the hand, feet) with wireless power-coils and a battery; and (2) receiver-tags embedded in interactive devices, making them battery-free. Many devices only require power during interaction (e.g., TV remotes, digital calipers). We leverage this interactive opportunity by inductively transferring energy from the user’s coil to the device’s coil when in close proximity. To achieve this, we engineered receiver-tags and coils, including thin pancake-coils best-suited for wearables and spherical-coils that receive power omnidirectionally. To understand which coils best support a wide range of interactions (e.g., grasping, touching, hovering), we performed technical characterizations, including impedance and 3D efficiency analyses. We believe our technical approach can inspire ubiquitous computing with new ways to scale up the number and diversity of battery-free devices, not just sensors (µWatts) but also actuators (Watts).

2025 · Alex Mazursky et al. · University of Chicago · Ubiquitous Computing; Circuit Making & Hardware Prototyping · CHI
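To a first order, the efficiency of an inductive link like this depends on the coupling coefficient k between the two coils and their quality factors: with figure of merit U = k·sqrt(Q_tx·Q_rx), the maximum achievable link efficiency is U²/(1 + sqrt(1 + U²))². A sketch with illustrative coil values (not measurements from the paper):

```python
import math

def link_efficiency(k, q_tx, q_rx):
    # First-order maximum efficiency of a resonant inductive link:
    # eta = U^2 / (1 + sqrt(1 + U^2))^2, with U = k * sqrt(Q_tx * Q_rx).
    u = k * math.sqrt(q_tx * q_rx)
    return u * u / (1.0 + math.sqrt(1.0 + u * u)) ** 2

# Illustrative scenarios: a finger pad pressed onto a tag vs. hovering away.
touching = link_efficiency(k=0.40, q_tx=80, q_rx=40)   # tight coupling
hovering = link_efficiency(k=0.02, q_tx=80, q_rx=40)   # weak coupling
```

This is why coil geometry matters so much in the characterizations above: pancake vs. spherical coils trade peak coupling (k when touching) against how gracefully k, and thus efficiency, falls off while hovering or grasping.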
Can a Smartwatch Move Your Fingers? Compact and Practical Electrical Muscle Stimulation in a Smartwatch

Smartwatches have gained mainstream popularity, making them today’s de-facto wearables. Despite advancements in sensing, haptics on smartwatches is still restricted to tactile feedback (e.g., vibration); most smartwatch-sized actuators cannot render strong force-feedback. Meanwhile, electrical muscle stimulation (EMS) promises compact force-feedback but, to actuate fingers, requires users to wear many electrodes on their forearms. While forearm electrodes provide good accuracy, they keep EMS from being a practical force-feedback interface. To address this, we propose moving the electrodes to the wrist—conveniently packing them into the backside of a smartwatch. In our first study, we found that by cross-sectionally stimulating the wrist in 1,728 trials, we can actuate thumb extension, index extension & flexion, middle flexion, pinky flexion, and wrist flexion. Following, we engineered a compact EMS device that integrates directly into a smartwatch’s wristband (with a custom stimulator, electrodes, demultiplexers, and communication). In our second study, we found that participants could calibrate our device by themselves ~50% faster than with conventional EMS. Furthermore, all participants preferred the experience of this device, especially for its social acceptability & practicality. We believe that our approach opens new applications for smartwatch-based interactions, such as haptic assistance during everyday tasks.

2024 · Akifumi Takahashi et al. · Electrical Muscle Stimulation (EMS); Smartwatches & Fitness Bands · UIST
Augmented Breathing via Thermal Feedback in the Nose

We propose, engineer, and study a novel method to augment the feeling of breathing—enabling interactive applications to let users feel like they are inhaling more/less air (perceived nasal airflow). We achieve this effect by cooling or heating the nose in sync with the user’s inhalation. Our illusion builds on the physiology of breathing: we perceive our breath predominantly through the cooling of our nasal cavities during inhalation. This is why breathing in a “fresh” cold environment feels easier than in a “stuffy” hot environment, even when the inhaled volume is the same. Our psychophysical study confirmed that our in-nose temperature stimulation significantly influenced breathing perception in both directions: making it feel harder & easier to breathe. Further, we found that ~90% of the trials were described as a change in perceived airflow/breathing, while only ~8% as temperature. Following, we engineered a compact device worn across the septum that uses Peltier elements. We illustrate the potential of this augmented breathing in interactive contexts, such as for virtual reality (e.g., rendering ease of breathing crisp air or difficulty breathing with a deteriorated gas mask) and everyday interactions (e.g., in combination with a relaxation application or to alleviate the perceived breathing resistance when wearing a mask).

2024 · Jas Brooks et al. · VR Medical Training & Rehabilitation; Biosensors & Physiological Monitoring · UIST
Designing Plant-Driven Actuators for Robots to Grow, Age, and Decay

Designing plant-driven actuators presents an opportunity to create new types of devices that grow, age, and decay, such as robots that embody these qualities in their physical structure. Plant-robot hybrids that grow and decay incorporate the unpredictable and gradual transformations inherent to living organisms and suggest an alternative to the design principles of immediacy, responsiveness, control, accuracy, and durability commonly found in robotic design. To explore this, we present a design space of primitives for plant-driven robotic actuators. Proof-of-concept prototypes illustrate how concepts like slow change, slow movement, decay, and destruction can be incorporated into robotic forms. We describe the design considerations required for building plant-driven actuators for robots, including experimental findings regarding the mechanical properties of plant forces. Finally, we speculate on the potential benefits of plant-robot hybrids to interactive domains such as robotics.

2024 · Yuhan Hu et al. · Shape-Changing Interfaces & Soft Robotic Materials · DIS
Stick&Slip: Altering Fingerpad Friction via Liquid Coatings

We present Stick&Slip, a novel approach that alters friction between the fingerpad & surfaces by depositing liquid droplets that coat the fingerpad. The liquid coating modifies the finger’s coefficient of friction, allowing users to feel surfaces up to ±60% more slippery or sticky. We selected our fluids to rapidly evaporate so that the surface returns to its original friction. Unlike traditional friction-feedback, such as electroadhesion or vibration, our approach: (1) alters friction on a wide range of surfaces and geometries, making it possible to modulate nearly any non-absorbent surface; (2) scales to many objects without instrumenting the target surfaces (e.g., with conductive electrode coatings or vibromotors); and (3) both increases and decreases friction via a single device. We identified nine liquids and characterized their practicality by measuring evaporation rates, etc. To illustrate the applicability of our approach, we demonstrate how it enables friction feedback in virtual/mixed-reality or, even, while using everyday objects/tools.

2024 · Alex Mazursky et al. · University of Chicago · Vibrotactile Feedback & Skin Stimulation; Shape-Changing Interfaces & Soft Robotic Materials · CHI
Haptic Source-effector: Full-body Haptics via Non-invasive Brain Stimulation

We propose a novel concept for haptics in which one centralized on-body actuator renders haptic effects on multiple body parts by stimulating the brain, i.e., the source of the nervous system—we call this a haptic source-effector, as opposed to the traditional wearables’ approach of attaching one actuator per body part (end-effectors). We implement our concept via transcranial-magnetic-stimulation (TMS)—a non-invasive technique from neuroscience/medicine in which electromagnetic pulses safely stimulate brain areas. Our approach renders ~15 touch/force-feedback sensations throughout the body (e.g., hands, arms, legs, feet, and jaw—which we found in our first user study), all by stimulating the user’s sensorimotor cortex with a single magnetic coil moved mechanically across the scalp. In our second user study, we probed into participants’ experiences while using our haptic display in VR. Finally, as the first implementation of full-body haptics based on non-invasive brain stimulation, we discuss the roadmap to extend its interactive opportunities.

2024 · Yudai Tanaka et al. · University of Chicago · Electrical Muscle Stimulation (EMS); Brain-Computer Interface (BCI) & Neurofeedback · CHI
SplitBody: Reducing Mental Workload while Multitasking via Muscle Stimulation

Techniques like electrical muscle stimulation (EMS) offer promise in assisting physical tasks by automating movements, e.g., shaking a spray-can or tapping a button. However, existing actuation systems improve the performance of a task that users are already focusing on (e.g., users are already focused on using the spray-can). Instead, we investigate whether these interactive-actuation systems (e.g., EMS) offer any benefits if they automate a task that happens in the background of the user’s focus. Thus, we explored whether automating a repetitive movement via EMS would reduce mental workload while users perform parallel tasks (e.g., focusing on writing an essay while EMS stirs a pot of soup). In our study, participants performed a cognitively-demanding multitask either aided by EMS (SplitBody condition) or entirely by themselves (baseline). We found that with SplitBody, performance increased (35% on both tasks, 18% on the non-EMS-automated task), physical demand decreased (31%), and mental workload decreased (26%).

2024 · Romain Nith et al. · University of Chicago · Electrical Muscle Stimulation (EMS) · CHI
Haptic Permeability: Adding Holes to Tactile Devices Improves Dexterity

Feeling with our fingerpads is how we accomplish manual tasks (e.g., operating a needle or pressing buttons). Following this, research started adding actuators atop the users’ fingerpads to render haptic feedback for interactive virtual environments. Recently, many have moved away from thick actuators (e.g., vibration motors) and turned to electrode-films with electrotactile stimulation—allowing users to still feel some sensations through the devices when touching physical objects (e.g., compliance or some macro features). However, we argue & demonstrate that thin devices are not enough to maximize the user’s dexterity. We evaluate how adding small holes to electrotactile films can allow direct contact and thus increase haptic permeability, resulting in: (1) improved perception of tactile features; and (2) improved force control in grasping tasks. Finally, we observed participants in interactive experiences and found that holes can preserve dexterity in physical tasks while still benefiting from haptic feedback.

2024 · Shan-Yuan Teng et al. · University of Chicago · Vibrotactile Feedback & Skin Stimulation; Force Feedback & Pseudo-Haptic Weight · CHI
Affective Touch as Immediate and Passive Wearable Intervention

We investigated affective touch as a new pathway to passively mitigate in-the-moment anxiety. While existing mobile interventions offer great promise for health and well-being, they typically focus on achieving long-term effects such as shifting behaviors. As such, most mobile interventions cannot provide immediate help in acute conditions, i.e., when a user experiences a high anxiety level during ongoing events (e.g., completing high-stake tasks or mitigating interpersonal conflicts). A few works have developed passive interventions that are effective in-the-moment by leveraging breathing regulation and biofeedback. In this paper, we drew on neuroscientific findings on affective touch (the slow stroking of hairy skin that can elicit innate pleasantness) and evaluated affective touch as a mobile health intervention. To induce affective touch, we first engineered a wearable device that renders a soft stroking sensation on the user’s forearm. Then, we conducted a between-group experiment with 24 participants, in which they underwent high-stress situations with or without receiving affective touch, followed by post-experiment interviews. Our results showed that participants who received affective touch experienced lower state anxiety than the control group, with the same level of physiological stress response. We also found that affective touch facilitated emotion regulation by rendering pleasantness, providing emotional support, and shifting attention. Finally, we discussed the immediate effect of affective touch on anxiety and physiological stress, the benefits of affective touch as a passive intervention, and the implementation considerations for using affective touch in just-in-time systems.

https://dl.acm.org/doi/10.1145/3569484

2023 · Yiran Zhao et al. · Haptic Wearables; Sleep & Stress Monitoring · UbiComp
Interactive Benefits from Switching Electrical to Magnetic Muscle Stimulation

Electrical muscle stimulation (EMS) has become a popular method for force-feedback without mechanical actuators. While much has been written about the advantages of EMS, not much work has investigated circumventing its key limitations: (1) as impulses traverse the skin, they cause an uncomfortable “tingling”; (2) impulses are delivered via gelled-electrodes, which not only require direct skin contact (must be worn under clothes) but also (3) dry up after a few hours. To tackle these, we explore switching from electrical to magnetic muscle stimulation (MMS), via electromagnetic fields generated by coils. The first advantage is that MMS coils do not require direct skin contact and can actuate up to 5 cm away (Study#1)—this enables applications not possible with EMS, such as stimulation over clothes and without ever replacing electrodes. Second, and more importantly, MMS results in ~50% less discomfort caused by tingling than EMS (Study#2). We found that reducing this tingling discomfort has two downstream effects for interactive systems: (1) participants rated MMS force-feedback as more realistic than that of EMS (Study#3); and (2) participants could more accurately perceive the pose actuated by the interactive system (Study#4). Finally, we demonstrated applications where our proposed switch from EMS to MMS improves user experience, including VR feedback, gaming, and pose-control.

2023 · Yudai Tanaka et al. · Force Feedback & Pseudo-Haptic Weight; Electrical Muscle Stimulation (EMS); Full-Body Interaction & Embodied Input · UIST
ThermalRouter: Enabling Users to Design Thermally-Sound Devices

Users often 3D model enclosures that interact with significant heat sources, such as electronics or appliances that generate heat (e.g., CPUs, motors, lamps). While parts made by users might function well aesthetically or structurally, they are rarely thermally-sound. This happens because heat transfer is non-intuitive; thus, engineering thermal solutions is not straightforward. To tackle this, we developed ThermalRouter, a CAD plugin that assists users with improving the thermal performance of their models. ThermalRouter automatically converts regions of the model to be made from thermally-conductive materials (such as nylon or metallic-silicone). These regions act as heat channels, branching away from hotspots to dissipate heat. The key is that ThermalRouter automatically simulates the thermal performance of many possible heat-channel configurations and presents the user with the most thermally-sound design (e.g., lowest temperature). Furthermore, it allows users to customize designs by balancing costs, indicating non-modifiable geometry, etc. Most importantly, ThermalRouter achieves this without requiring manual labor to set up or parse the results of complex thermal simulations.

2023 · Alex Mazursky et al. · Laser Cutting & Digital Fabrication; Circuit Making & Hardware Prototyping; Customizable & Personalized Objects · UIST
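The "simulate many configurations, keep the coolest" loop at the heart of this approach can be sketched with a toy finite-difference heat solve, where heat-channel cells diffuse heat faster toward a fixed-temperature boundary. The grid, diffusivities, and heat input below are all illustrative stand-ins for a real thermal simulation:

```python
def simulate(n, channel_cells, hotspot, iters=500):
    # Toy explicit heat solve on an n-by-n grid: channel cells diffuse faster,
    # the boundary stays at 0 (heat sink), and the hotspot gains heat each step.
    t = [[0.0] * n for _ in range(n)]
    for _ in range(iters):
        nxt = [[0.0] * n for _ in range(n)]
        for i in range(1, n - 1):
            for j in range(1, n - 1):
                alpha = 0.24 if (i, j) in channel_cells else 0.10
                lap = t[i-1][j] + t[i+1][j] + t[i][j-1] + t[i][j+1] - 4 * t[i][j]
                nxt[i][j] = t[i][j] + alpha * lap
        nxt[hotspot[0]][hotspot[1]] += 1.0       # constant heat input
        t = nxt
    return max(max(row) for row in t)            # peak (hotspot) temperature

def best_channel(candidates, hotspot, n=9):
    # Keep the channel layout whose simulated peak temperature is lowest.
    return min(candidates, key=lambda cells: simulate(n, set(cells), hotspot))

# A straight channel from the hotspot to the edge vs. no channel at all.
candidates = [[], [(4, 4), (4, 3), (4, 2), (4, 1)]]
best = best_channel(candidates, hotspot=(4, 4))
```

Even in this toy model, a conductive channel running from the hotspot to the boundary lowers the peak temperature, which is exactly the signal the plugin optimizes for across many candidate layouts.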
ecoEDA: Recycling E-waste During Electronics Design

The amount of e-waste generated by discarding devices is enormous, but options for recycling remain limited. However, inside a discarded device (from consumer devices to one’s own prototypes), an electronics designer could find dozens to thousands of reusable components, including microcontrollers, sensors, voltage regulators, etc. Despite this, existing electronic design tools assume users will buy all components anew. To tackle this, we propose ecoEDA, an interactive tool that enables electronics designers to explore recycling electronic components during the design process. We accomplish this via (1) creating suggestions to assist users in identifying and designing with recycled components; and (2) maintaining a library of useful data relevant to reuse (e.g., allowing users to find which devices contain which components). Through example use-cases, we demonstrate how our tool can enable various pathways to recycling e-waste. To evaluate it, we conducted a user study where participants used our tool to create an electronic schematic with components from torn-down e-waste devices. We found that participants’ designs made with ecoEDA featured an average of 66% recycled components. Last, we reflect on challenges and opportunities for building software that promotes e-waste reuse.

2023 · Jasmine Lu et al. · Desktop 3D Printing & Personal Fabrication; Ecological Design & Green Computing · UIST
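The suggestion step can be sketched as a lookup that, for each part in the schematic, returns harvested components of the same kind, with a value tolerance for passives. The 5% tolerance and all part names below are hypothetical, and the real tool's library tracks far richer data (footprints, pinouts, source devices):

```python
def suggest(schematic, inventory, tol=0.05):
    # For each schematic part (kind, value), list harvested components of the
    # same kind; valued parts (e.g., resistors) must match within +/- tol.
    suggestions = {}
    for ref, (kind, value) in schematic.items():
        hits = []
        for name, (inv_kind, inv_value) in inventory.items():
            if inv_kind != kind:
                continue
            if value is not None and abs(inv_value - value) > tol * value:
                continue
            hits.append(name)
        suggestions[ref] = hits
    return suggestions

# Hypothetical schematic and inventory of components torn down from e-waste.
schematic = {"R1": ("resistor", 10_000), "U1": ("ATmega328P", None)}
inventory = {
    "router_R12": ("resistor", 10_200),   # within 5% of 10 kOhm
    "toy_R3":     ("resistor", 4_700),    # wrong value, excluded
    "board_U7":   ("ATmega328P", None),
}
suggestions = suggest(schematic, inventory)
```

A designer placing R1 would then be prompted with "router_R12" as a recycled substitute instead of a newly purchased resistor.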
FeetThrough: Electrotactile Foot Interface that Preserves Real-World Sensations

Haptic interfaces have been extended to the feet to enhance foot-based activities, such as guidance while walking or stepping on virtual textures. Most feet haptics use mechanical actuators, namely vibration motors. However, we argue that vibration motors are not the ideal actuators for all feet haptics. Instead, we demonstrate that electrotactile stimulation provides qualities that make it a powerful feet-haptic interface: (1) users wearing electrotactile devices can not only feel the stimulation but can also better feel the terrain under their feet—this is critical as our feet are also responsible for balance on uneven terrain and stairs—electrotactile achieves this improved “feel-through” effect because it is thinner than vibrotactile actuators, at 0.1 mm in our prototype; (2) while a single vibrotactile actuator also vibrates surrounding skin areas, we found improved two-point discrimination thresholds for electrotactile; (3) electrotactile can be applied directly to soles, insoles, or socks, enabling new applications such as barefoot interactive experiences, without requiring users to have custom shoes with built-in vibration motors. Finally, we demonstrate applications in which electrotactile feet interfaces allow users to feel not only virtual information but also the real terrain under their shoes, such as a VR experience where users walk on ground props and a tactile navigation system that augments the ground with virtual tactile paving to assist pedestrians in low-vision situations.

2023 · Keigo Ushiyama et al. · Vibrotactile Feedback & Skin Stimulation; Foot & Wrist Interaction · UIST
Taste Retargeting via Chemical Taste Modulators

Prior research has explored modifying taste through electrical stimulation. While promising, such interfaces often only elicit taste changes while in contact with the user’s tongue (e.g., cutlery with electrodes), making them incompatible with eating and swallowing real foods. Moreover, most interfaces cannot selectively alter basic tastes, but only the entire flavor profile (e.g., they cannot selectively alter bitterness). To tackle this, we propose taste retargeting, a method of altering taste perception by delivering chemical modulators to the mouth before eating. These modulators temporarily change the response of taste receptors to foods, selectively suppressing or altering basic tastes. Our first study identified six accessible taste modulators that suppress salty, umami, sweet, or bitter, and transform sour into sweet. Using these findings, we demonstrated an interactive application of this technique in virtual reality, which we validated in our second study. We found that taste retargeting reduced the flavor mismatch between a food prop and other virtual foods.

2023 · Jas Brooks et al. · Immersion & Presence Research; Food Culture & Food Interaction · UIST
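The modulator-selection logic can be sketched as matching a prop's taste profile against the target virtual food's profile: suppress tastes the prop has but the target lacks, and apply the sour-to-sweet transform when the target is sweeter than the prop. The modulator names and rules below are a simplified, hypothetical rendering of the six modulators described above:

```python
# Hypothetical modulator table: four suppressors plus one transform,
# mirroring the categories in the abstract (details differ from the paper).
MODULATORS = {
    "salty_blocker":  ("suppress", "salty"),
    "umami_blocker":  ("suppress", "umami"),
    "sweet_blocker":  ("suppress", "sweet"),
    "bitter_blocker": ("suppress", "bitter"),
    "sour_to_sweet":  ("transform", ("sour", "sweet")),
}

def retarget(prop, target):
    # Pick modulators that move the prop's taste profile toward the target:
    # suppress tastes present in the prop but absent from the target, and
    # transform sour into sweet when the target is sweeter than the prop.
    chosen = []
    for name, (kind, spec) in MODULATORS.items():
        if kind == "suppress":
            if prop.get(spec, 0) > 0 and target.get(spec, 0) == 0:
                chosen.append(name)
        else:
            src, dst = spec
            if prop.get(src, 0) > 0 and target.get(dst, 0) > prop.get(dst, 0):
                chosen.append(name)
    return chosen

# A sour-bitter prop (think grapefruit) retargeted toward a sweet virtual food.
mods = retarget(prop={"sour": 1, "bitter": 1}, target={"sweet": 1})
```

Delivering the selected modulators before the bite then lets a single physical prop stand in for several different virtual foods.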