Exploring the Remapping Impact of Spatial Head-Hand Relations in Immersive Telesurgery
Action remapping between the user and the avatar creates significant perceptual and behavioral challenges. Recently, in addition to virtual environments, remapping has also given rise to new applications: immersive teleoperated robots. This paper takes immersive telesurgery, a representative scenario, as a research opportunity for exploring the generalized effects of remapping. In such a scenario, the operator observes through the robot's camera and uses their hands to control the robotic arms, as if they were the robot. However, common remapping of spatial head-hand relations, due to camera adjustments and robotic-arm switching, creates significant visual-proprioceptive conflicts and physical limitations. To explore this, we simulated a telesurgery system with 6 head-camera and 12 hand-robotic-arm remapping conditions, assessing non-surgeon participants across four surgical tasks: navigation, location, cutting, and bimanual coordination. The study examines spatial perception bias, interaction deviation, workload, and task completion time. Our findings reveal how different remapping targets, attributes, intensities, and situations affect performance, contributing to the understanding of perception mechanisms and offering insights for optimizing operations or systems.
2025 · Tianren Luo et al. · Institute of Software, Chinese Academy of Sciences; College of Computer Science and Technology, University of Chinese Academy of Sciences · Teleoperated Driving; Human-Robot Collaboration (HRC) · CHI
Slip-Grip: An Electrotactile Method to Simulate Weight
Weight perception is crucial for immersive virtual reality (VR) interactions, yet providing weight feedback remains a significant research challenge. We introduce a novel weight simulation technique that leverages electrotactile stimulation to induce slip illusions. These slip illusions occur when users grip an object with less force than a predefined threshold, allowing the device to modulate the grip force and encourage a tighter grip. In our approach, heavier virtual weights correspond to higher required grip forces. We conducted a series of user experiments to validate our technique, confirming that it effectively induces slip illusions. We also investigated the relationship between electrotactile sensations, grip force, and changes in force, demonstrating that this association enhances the weight perception experience. Lastly, we explored the mapping between grip force and perceived weight, observing strong linearity within participants but notable variability between individuals.
2025 · Hongnan Lin et al. · Institute of Software, Chinese Academy of Sciences · Force Feedback & Pseudo-Haptic Weight; Electrical Muscle Stimulation (EMS) · CHI
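The Slip-Grip control idea described in the abstract can be sketched in a few lines: a virtual weight maps to a required grip force, and stimulation fires whenever measured grip drops below it. This is an illustrative sketch, not the authors' implementation; the linear-mapping constants `k` and `f0`, the function names, and the stimulation callback are all hypothetical.

```python
# Illustrative sketch of the Slip-Grip loop (not the authors' code).
# Constants k and f0 are made-up values for a linear weight-to-force mapping,
# which the abstract reports as roughly linear within participants.

def required_grip_force(virtual_weight_kg, k=8.0, f0=1.5):
    """Map a virtual weight to a grip-force threshold in newtons."""
    return f0 + k * virtual_weight_kg

def update(measured_grip_n, virtual_weight_kg, stimulate):
    """One control tick: trigger an electrotactile slip illusion when
    the measured grip is looser than the weight-dependent threshold,
    encouraging the user to tighten their grip."""
    threshold = required_grip_force(virtual_weight_kg)
    if measured_grip_n < threshold:
        stimulate()  # hypothetical hook into the electrotactile driver
        return "slip"
    return "stable"

events = []
state = update(measured_grip_n=3.0, virtual_weight_kg=0.5,
               stimulate=lambda: events.append("pulse"))
```

A heavier object simply raises the threshold, so the same loose grip that is "stable" for a light object registers as a slip for a heavy one.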
RemapVR: An Immersive Authoring Tool for Rapid Prototyping of Remapped Interaction in VR
Remapping techniques in VR, such as repositioning, redirection, and resizing, have been extensively studied. Still, interaction designers rarely have the opportunity to use them due to high technical and knowledge barriers. In this paper, we extract common features of 24 existing remapping techniques and develop a high-fidelity immersive authoring tool, RemapVR, for rapidly building and experiencing prototypes of remapped spatial properties in VR that are unperceivable or acceptable to users. RemapVR provides designers with a series of functions for editing remappings and visualizing spatial property changes, mapping relationships between the real and virtual worlds, sensory conflicts, etc. Designers can quickly build existing remappings via templates, and author new remappings by interactively recording the spatial relations between an input trajectory in the real world and an output trajectory in the virtual world. User studies showed that the designs of RemapVR can effectively improve designers' authoring experience and efficiency, and support designers in authoring remapping prototypes that meet scene requirements and provide a good user experience.
2025 · Tianren Luo et al. · Institute of Software, Chinese Academy of Sciences; College of Computer Science and Technology, University of Chinese Academy of Sciences · Mixed Reality Workspaces; Prototyping & User Testing · CHI
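The simplest family of remappings covered by tools like RemapVR can be illustrated as a per-axis gain applied to tracked motion: repositioning, redirection, and resizing all reduce to transforming a real-world trajectory into a virtual one. This is a minimal sketch of that core idea, not RemapVR's actual API; the gain values and function names are hypothetical.

```python
# Minimal sketch of gain-based motion remapping (illustrative only).
# A gain != 1 introduces a visual-proprioceptive offset that remapping
# research tries to keep below users' perceptual thresholds.

def remap_translation(real_delta, gain=(1.0, 1.0, 1.0)):
    """Scale one real head/hand displacement per axis into a virtual one."""
    return tuple(d * g for d, g in zip(real_delta, gain))

def remap_trajectory(real_points, gain=(1.2, 1.0, 1.2)):
    """Apply the gain to successive deltas, accumulating a virtual path
    from the same starting point as the real path."""
    virtual = [real_points[0]]
    for prev, curr in zip(real_points, real_points[1:]):
        delta = tuple(c - p for c, p in zip(curr, prev))
        step = remap_translation(delta, gain)
        virtual.append(tuple(v + s for v, s in zip(virtual[-1], step)))
    return virtual

path = remap_trajectory([(0, 0, 0), (1, 0, 0), (1, 1, 0)])
```

Recording an input trajectory and an output trajectory, as the abstract describes, generalizes this idea from a fixed gain to an arbitrary learned mapping between the two paths.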
Exploring the Effects of Sensory Conflicts on Cognitive Fatigue in VR Remappings
Virtual reality (VR) presents significant cognitive challenges due to its immersive nature and frequent sensory conflicts. This study systematically investigates the impact of sensory conflicts induced by VR remapping techniques on cognitive fatigue, and unveils their correlation. We utilized three remapping methods (haptic repositioning, head-turning redirection, and giant resizing) to create different types of sensory conflicts, and measured perceptual thresholds to induce various intensities of the conflicts. Through experiments involving cognitive tasks along with subjective and physiological measures, we found that all three remapping methods influenced the onset and severity of cognitive fatigue, with visual-vestibular conflict having the greatest impact. Interestingly, visual-experiential/memory conflict showed a mitigating effect on cognitive fatigue, emphasizing the role of novel sensory experiences. This study contributes to a deeper understanding of cognitive fatigue under sensory conflicts and provides insights for designing VR experiences that align better with human perceptual and cognitive capabilities.
2024 · Tianren Luo et al. · Eye Tracking & Gaze Interaction; Immersion & Presence Research · UIST
Understanding the Effects of Restraining Finger Coactivation in Mid-Air Typing: From a Neuromechanical Perspective
Typing in mid-air is often perceived as intuitive yet presents challenges due to finger coactivation, a neuromechanical phenomenon involving involuntary finger movements that stem from the lack of physical constraints. Previous studies examined and addressed the impacts of finger coactivation using algorithmic approaches. Alternatively, this paper explores the neuromechanical effects of finger coactivation on mid-air typing, aiming to deepen our understanding and provide valuable insights to improve these interactions. We utilized a wearable device that restrains finger coactivation as a prop to conduct two mid-air studies, including a rapid finger-tapping task and a ten-finger typing task. The results revealed that restraining coactivation reduced mispresses, a classic coactivated error commonly considered the harm caused by coactivation. Unexpectedly, reductions in motor control errors and spelling errors, regarded as non-coactivated errors, were also observed. Additionally, the study evaluated the neural resources involved in motor execution using functional near-infrared spectroscopy (fNIRS), which tracked cortical arousal during mid-air typing. The findings demonstrated decreased activation in the primary motor cortex of the left hemisphere when coactivation was restrained, suggesting a diminished motor execution load. This reduction suggests that a portion of neural resources is conserved, which also potentially aligns with the perceived lower mental workload and decreased frustration levels.
2024 · Hechuan Zhang et al. · Full-Body Interaction & Embodied Input · UIST
WieldingCanvas: Interactive Sketch Canvases for Freehand Drawing in VR
Sketching in virtual reality (VR) is challenging mainly due to the absence of physical surface support and virtual depth perception cues, which induce high cognitive and sensorimotor load. This paper presents WieldingCanvas, an interactive VR sketching platform that integrates canvas manipulations to draw lines and curves in 3D. Informed by real-life examples of two-handed creative activities, WieldingCanvas interprets users' spatial gestures to move, swing, rotate, transform, or fold a virtual canvas, whereby users simply draw primitive strokes on the canvas, which are turned into finer and more sophisticated shapes via the manipulation of the canvas. We evaluated the capability and user experience of WieldingCanvas with three studies where participants were asked to sketch target shapes. A set of freehand sketches of high aesthetic quality was created, and the results demonstrated that WieldingCanvas can assist users in creating 3D sketches.
2024 · Xiaohui Tan et al. · Capital Normal University · Mixed Reality Workspaces; 3D Modeling & Animation; Interactive Narrative & Immersive Storytelling · CHI
TacTex: A Textile Interface with Seamlessly Integrated Electrodes for High-Resolution Electrotactile Stimulation
This paper presents TacTex, a textile-based interface that provides high-resolution haptic feedback and touch-tracking capabilities. TacTex utilizes electrotactile stimulation, which has traditionally posed challenges due to limitations in textile electrode density and quantity. TacTex overcomes these challenges by employing a multi-layer woven structure that separates conductive weft and warp electrodes with non-conductive yarns. The driving system for TacTex includes a power supply, a sensing board, and switch boards to enable spatial and temporal control of electrical stimuli on the textile, while simultaneously monitoring voltage changes. TacTex can produce a wide range of haptic effects, including static and dynamic patterns and different sensation qualities, with a resolution of 512 × 512, based on linear electrodes spaced as closely as 2 mm. We evaluate the performance of the interface with user studies and demonstrate potential applications of TacTex in adding haptic feedback to everyday textiles.
2024 · Hongnan Lin et al. · Institute of Software, Chinese Academy of Sciences · Vibrotactile Feedback & Skin Stimulation; Shape-Changing Interfaces & Soft Robotic Materials; Electronic Textiles (E-textiles) · CHI
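The weft/warp structure behind TacTex's 512 × 512 resolution is the classic row/column multiplexing idea: addressing the crossing of one driven line and one grounded line selects a single point. The sketch below is illustrative only; the switch-state names and the drive/ground convention are assumptions, not TacTex's actual firmware.

```python
# Illustrative sketch of row/column addressing on a woven electrode grid
# (not TacTex's actual driver). "drive"/"ground"/"float" name the assumed
# switch states: current source, return path, and high impedance.

GRID = 512  # the abstract reports a 512 x 512 stimulation resolution

def frame_for_point(row, col, grid=GRID):
    """Switch states for one stimulation point: drive the selected weft
    (row) line, ground the selected warp (column) line, leave the rest
    floating so no other crossing carries current."""
    assert 0 <= row < grid and 0 <= col < grid
    weft = ["float"] * grid
    warp = ["float"] * grid
    weft[row] = "drive"
    warp[col] = "ground"
    return weft, warp

def pattern_frames(points):
    """Time-multiplex several points: one frame per point, scanned fast
    enough that users perceive the pattern as simultaneous."""
    return [frame_for_point(r, c) for r, c in points]

frames = pattern_frames([(0, 0), (255, 511)])
```

Scanning frames in time is what gives such an interface "spatial and temporal control" of stimuli with only 2 × 512 drive lines instead of 512 × 512 individual wires.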
Exploring Experience Gaps Between Active and Passive Users During Multi-user Locomotion in VR
Multi-user locomotion in VR has grown increasingly common, posing numerous challenges. A key contributing factor is the gap in experience between active and passive users during co-locomotion. Yet, there remains a limited understanding of how and to what extent these experiential gaps manifest in diverse multi-user co-locomotion scenarios. This paper systematically explores the gaps in physiological and psychological experience indicators between active and passive users across various locomotion situations, including when active users walk, fly by joystick, or teleport, and passive users stand still or look around. We also assess the impact of factors such as sub-locomotion type, speed/teleport interval, and motion sickness susceptibility. Accordingly, we delineate acceptability disparities between active and passive users, offering insights into leveraging notable experimental findings to mitigate discomfort during co-locomotion through avoidance or intervention.
2024 · Tianren Luo et al. · Institute of Software, College of Computer Science and Technology · Social & Collaborative VR; Immersion & Presence Research · CHI
ThermoFit: Thermoforming Smart Orthoses via Metamaterial Structures for Body-Fitting and Component-Adjusting
Smart orthoses hold great potential for intelligent rehabilitation monitoring and training. However, most of these electronic assistive devices are too difficult for daily use and challenging to modify to accommodate variations in body shape and medical needs. For clinicians, the customization pipeline of these smart devices imposes significant learning costs. This paper introduces ThermoFit, an end-to-end design and fabrication pipeline for thermoforming smart orthoses that adheres to the clinically accepted procedure. ThermoFit enables the shapes and electronics positions of smart orthoses to conform to bodies and allows rapid iteration by integrating low-cost Low-Temperature Thermoplastics (LTTPs) with custom metamaterial structures and electronic components. Specifically, three types of metamaterial structures are used in LTTPs to reduce the wrinkles caused by the thermoforming process and to permit component position adjustment and joint movement. A design tool prototype aids in generating metamaterial patterns and optimizing component placement and circuit routing. Three applications show that ThermoFit can be shaped on bodies into different wearables. Finally, a hands-on study with a clinician verifies the user-friendliness of thermoforming smart orthoses, and technical evaluations demonstrate fabrication efficiency and electronic continuity. https://doi.org/10.1145/3580806
2023 · Guanyun Wang et al. · Haptic Wearables; Circuit Making & Hardware Prototyping · UbiComp
EmTex: Prototyping Textile-Based Interfaces through an Embroidered Construction Kit
As electronic textiles have become more advanced in sensing, actuating, and manufacturing, incorporating smartness into fabrics has become of special interest to ubiquitous computing and interaction researchers and designers. However, innovating smart textile interfaces for numerous input and output modalities usually requires expert-level knowledge of specific materials, fabrication, and protocols. This paper presents EmTex, a construction kit based on embroidered textiles, patterned with dedicated sensing, actuating, and connecting components to facilitate the design and prototyping of smart textile interfaces. With machine embroidery, EmTex is compatible with a wide range of threads and underlay fabrics, proficient in various stitches to control the electric parameters, and capable of integrating versatile and reliable interaction functionalities with aesthetic patterns and precise designs. EmTex consists of 28 textile-based sensors, actuators, connectors, and displays, presented with standardized visual and tactile effects. Along with a visual programming tool, EmTex enables the prototyping of everyday textile interfaces for diverse living scenarios, embodying their touch input and visual and haptic output properties. With EmTex, we conducted a workshop and invited 25 designers and makers to create freeform textile interfaces. Our findings revealed that EmTex helped the participants explore novel interaction opportunities with various smart textile prototypes. We also identified challenges EmTex faces in practical use for promoting the design innovation of smart textiles.
2023 · Qi Wang et al. · Electronic Textiles (E-textiles); Desktop 3D Printing & Personal Fabrication · UIST
Exploring Locomotion Methods with Upright Redirected Views for VR Users in Reclining & Lying Positions
Using VR in reclining and lying positions is becoming common, but the upward views caused by posture have to be redirected to be parallel to the ground, as when users are standing. This affects users' locomotion performance in VR due to potential physical restrictions and the visual-vestibular-proprioceptive conflict. This paper is among the first to investigate suitable locomotion methods and how reclining and lying positions and redirection affect them in such conditions. A user-elicitation study was carried out to construct a set of locomotion methods based on users' preferences when they were in different reclining and lying positions. A second study developed user-preferred 'tapping' and 'chair rotating' gestures; by evaluating their performance at various body reclining angles, we measured the general impacts of posture and redirection. The results showed that these methods worked effectively but exposed some shortcomings, and users performed worst at 45-degree reclining angles. Finally, four upgraded methods were designed and verified to improve locomotion performance.
2023 · Tianren Luo et al. · Full-Body Interaction & Embodied Input; Immersion & Presence Research; Identity & Avatars in XR · UIST
Exploring Sensory Conflict Effects Due to Upright Redirection While Using VR in Reclining & Lying Positions
When users use virtual reality (VR) in nontraditional postures, such as while reclining or lying in relaxed positions, their views lean upwards and need to be corrected, so that they see upright content and perceive interactions as if they were standing. Such upright redirection is expected to cause visual-vestibular-proprioceptive conflict, affecting users' internal perceptions (e.g., body ownership, presence, simulator sickness) and external perceptions (e.g., egocentric space perception) in VR. Different body reclining angles may affect vestibular sensitivity and lead to dynamic weighting of multi-sensory signals in sensory integration. In this paper, we investigated the impact of upright redirection on users' perceptions, with users' physical bodies tilted backward at various angles and views upright-redirected accordingly. The results showed that upright redirection led to simulator sickness, confused self-awareness, a weak upright illusion, and increased space perception deviations to various extents at different reclining positions, and the situations were worst at the 45-degree conditions. Based on these results, we designed illusion-based and sensory-based methods that were shown effective in reducing the impact of sensory conflict through preliminary evaluations.
2022 · Tianren Luo et al. · Motion Sickness & Passenger Experience; Full-Body Interaction & Embodied Input; Immersion & Presence Research · UIST
HapTag: A Compact Actuator for Rendering Push-Button Tactility on Soft Surfaces
As touch interactions become ubiquitous in human-computer interaction, it is critical to enrich haptic feedback to improve efficiency, accuracy, and immersive experiences. This paper presents HapTag, a thin and flexible actuator that supports the integration of push-button tactile rendering into everyday soft surfaces. Specifically, HapTag works under the principle of the hydraulically amplified self-healing electrostatic (HASEL) actuator, optimized with an embedded pressure-sensing layer and activated with dedicated voltages in response to users' input actions, resulting in fast response times and controllable, expressive push-button tactile rendering capabilities. HapTag has a compact form factor and can be attached, integrated, or embedded on various soft surfaces such as cloth, leather, and rubber. Three common push-button tactile patterns were adopted and implemented with HapTag. We validated the feasibility and expressiveness of HapTag by demonstrating a series of innovative applications under different circumstances.
2022 · Yanjun Chen et al. · Vibrotactile Feedback & Skin Stimulation; Haptic Wearables; Shape-Changing Interfaces & Soft Robotic Materials · UIST
HoloBoard: A Large-Format Immersive Teaching Board Based on Pseudo-Holographics
In this paper, we present HoloBoard, an interactive large-format pseudo-holographic display system for lecture-based classes. With its unique properties of immersive visual display and a transparent screen, we designed and implemented a rich set of novel interaction techniques, such as immersive presentation, role-play, and lecturing behind the scene, that are potentially valuable for lecturing in class. We conducted a controlled experimental study to compare a HoloBoard class with a normal class by measuring students' learning outcomes and three dimensions of engagement (i.e., behavioral, emotional, and cognitive engagement). We used pre-/post-knowledge tests and multimodal learning analytics to measure students' learning outcomes and learning experiences. Results indicated that the lecture-based class utilizing HoloBoard led to slightly better learning outcomes and a significantly higher level of student engagement. Given the results, we discuss the impact of HoloBoard as an immersive medium in the classroom setting and suggest several design implications for deploying HoloBoard in immersive teaching practices.
2021 · Jiangtao Gong et al. · Mixed Reality Workspaces; Immersion & Presence Research · UIST
RElectrode: A Reconfigurable Electrode for Compound Sensing Based on Microfluidics
In this paper, we propose a reconfigurable electrode, RElectrode, using a microfluidic technique that can change the geometry and material properties of the electrode to satisfy the needs of sensing a variety of different types of user input through touch/touchless gestures, pressure, and temperature, and to distinguish between different types of objects or liquids. Unlike existing approaches, which depend on specifically shaped electrodes for particular sensing tasks (e.g., a coil for inductive sensing), RElectrode enables capacitance, inductance, resistance/pressure, temperature, and pH sensing all in a single package. We demonstrate the design and fabrication of the microfluidic structure of RElectrode, evaluate its sensing performance through several studies, and present some unique applications. RElectrode demonstrates the technical feasibility and application value of integrating physical and biochemical properties of microfluidics into novel sensing interfaces.
2021 · Wei Sun et al. · Institute of Software, Chinese Academy of Sciences · In-Vehicle Haptic, Audio & Multimodal Feedback; Vibrotactile Feedback & Skin Stimulation; Force Feedback & Pseudo-Haptic Weight · CHI
vMirror: Enhancing the Interaction with Occluded or Distant Objects in VR with Virtual Mirrors
Interacting with out-of-reach or occluded VR objects can be cumbersome. Although users can change their position and orientation, such as via teleporting, to help observe and select, doing so frequently may cause loss of spatial orientation or motion sickness. We present vMirror, an interactive widget leveraging the reflection of mirrors to observe and select distant or occluded objects. We first designed interaction techniques for placing mirrors and interacting with objects through mirrors. We then conducted a formative study to explore a semi-automated mirror placement method with manual adjustments. Next, we conducted a target-selection experiment to measure the effect of the mirror's orientation on users' performance. Results showed that vMirror can be as efficient as direct target selection for most mirror orientations. We further compared vMirror with the teleport technique in a virtual treasure hunt game and measured participants' task performance and subjective experiences. Finally, we discuss the vMirror user experience and present future directions.
2021 · Nianlong Li et al. · Institute of Software, Chinese Academy of Sciences · Social & Collaborative VR; Immersion & Presence Research · CHI
HapLinkage: Prototyping Haptic Proxies for Virtual Hand Tools Using Linkage Mechanisms
Haptic simulation of hand tools such as wrenches, pliers, scissors, and syringes is beneficial for finely detailed skill training in VR, but designing for numerous hand tools usually requires expert-level knowledge of specific mechanisms and protocols. This paper presents HapLinkage, a prototyping framework based on linkage mechanisms that provides typical motion templates and haptic renderers to facilitate proxy design for virtual hand tools. The mechanical structures can be easily modified, for example, to scale the size or to change the range of motion by selectively changing linkage lengths. Resistance, stop, release, and restoration force feedback are generated by an actuating module that is part of the structure. Additional vibration feedback can be generated with a linear actuator. HapLinkage enables easy and quick prototyping of hand tools for diverse VR scenarios that embody both their kinetic and haptic properties. Interviews with expert designers confirmed that HapLinkage is expressive for designing haptic proxies of hand tools to enhance VR experiences, and also identified potentials and future developments of the framework.
2020 · Nianlong Li et al. · Force Feedback & Pseudo-Haptic Weight; Haptic Wearables; Shape-Changing Interfaces & Soft Robotic Materials · UIST
HapBead: On-Skin Microfluidic Haptic Interface Using a Tunable Bead
On-skin haptic interfaces made of thin, flexible soft elastomers have improved significantly in recent years. Many focus on vibrotactile feedback, which requires complicated parameter tuning. Another approach is based on mechanical forces created via piezoelectric devices and other methods for non-vibratory haptic sensations such as stretching and twisting; these are often bulky with electronic components, and the associated drivers are complicated, with limited control of timing and precision. This paper proposes HapBead, a new on-skin haptic interface that is capable of rendering vibration-like tactile feedback using microfluidics. HapBead leverages a microfluidic channel to precisely and agilely oscillate a small bead via liquid flow, which then generates various motion patterns in the channel that create highly tunable haptic sensations on the skin. We developed a proof-of-concept design to implement the thin, flexible, and easily affordable HapBead platform, and verified its haptic rendering capabilities by attaching it to users' fingertips. A study confirmed that participants could accurately tell apart six different haptic patterns rendered by HapBead. HapBead enables new wearable display applications with multiple integrated functionalities, such as on-skin haptic doodles, visuo-haptic displays, and haptic illusions.
2020 · Teng Han et al. · Institute of Software, Chinese Academy of Sciences & University of Chinese Academy of Sciences · Vibrotactile Feedback & Skin Stimulation; Haptic Wearables; Biosensors & Physiological Monitoring · CHI
Mouillé: Exploring Wetness Illusion on Fingertips to Enhance Immersive Experience in VR
Providing users with rich sensations is beneficial for enhancing their immersion in virtual reality (VR) environments. Wetness is one such imperative sensation that affects users' sense of comfort and helps users adjust grip force when interacting with objects. Researchers have recently begun to explore ways to create wetness illusions, primarily on a user's face or body skin. In this work, we extended this line of research by creating wetness illusions on users' fingertips. We first conducted a user study to understand the effect of thermal and tactile feedback on users' perceived wetness sensation. Informed by the findings, we designed and evaluated a prototype, Mouillé, that provides various levels of wetness illusion on fingertips for both hard and soft items when users squeeze, lift, or scratch them. Study results indicated that users were able to feel wetness with different levels of temperature change and could distinguish three levels of wetness for simulated VR objects. We further presented applications that simulated an ice cube, an iced cola bottle, and a wet sponge, among others, to demonstrate its use in VR.
2020 · Teng Han et al. · Institute of Software, Chinese Academy of Sciences & University of Chinese Academy of Sciences · Shape-Changing Interfaces & Soft Robotic Materials; Immersion & Presence Research · CHI
Get a Grip: Evaluating Grip Gestures for VR Input Using a Lightweight Pen
The use of virtual reality (VR) in applications such as data analysis, artistic creation, and clinical settings requires high-precision input. However, the current design of handheld controllers, where wrist rotation is the primary input approach, does not exploit the human fingers' capability for dexterous movements in high-precision pointing and selection. To address this issue, we investigated the characteristics and potential of using a pen as a VR input device. We conducted two studies. The first examined which pen grip allowed the largest range of motion; we found that a tripod grip at the rear end of the shaft met this criterion. The second study investigated target selection via 'poking' and ray-casting, where we found the pen grip outperformed traditional wrist-based input in both cases. Finally, we demonstrate potential applications enabled by VR pen input and grip postures.
2020 · Nianlong Li et al. · Institute of Software, Chinese Academy of Sciences & University of Chinese Academy of Sciences · Full-Body Interaction & Embodied Input; Social & Collaborative VR · CHI