Imaginary Joint: Proprioceptive Feedback for Virtual Body Extensions via Skin Stretch

Virtual body extensions such as a wing or tail have the potential to offer users new bodily experiences and capabilities in virtual and augmented reality. To use these extensions as naturally as one's own body—particularly for body parts that are normally hard to see, such as a tail—it is essential to provide proprioceptive feedback that allows users to perceive the position, orientation, and force exerted by these parts, rather than relying solely on visual cues. In this study, we propose a novel approach that introduces an "Imaginary Joint" at the interface between the user's actual body and the virtual extension, delivering information about joint flexion and force through skin-stretch feedback. We present a wearable skin-stretch device and explore information mappings that convey the bending rotation and torque of the Imaginary Joint. The final system presents both types of information simultaneously by superimposing these skin deformations. Results from a controlled user experiment demonstrate that users could identify tail position and force without relying on visual cues, and could do so more effectively than with vibrotactile feedback. Furthermore, the tail was perceived as more embodied than in the vibrotactile condition, resulting in a more natural and intuitive sensation. Finally, we introduce several application scenarios, including Perception of Extended Bodies, Enhanced Bodily Expression, and Body-Mediated Communication, and discuss potential future extensions of this system.

2025 · UIST · Shuto Takashita et al. · Topics: Haptic Wearables; Shape-Changing Interfaces & Soft Robotic Materials; Dance & Body Movement Computing
Hydroptical Thermal Feedback: Spatial Thermal Feedback Using Visible Lights and Water

We control the temperature of materials in everyday interactions, recognizing temperature's important influence on our bodies, minds, and experiences. However, thermal feedback is an under-explored modality in human-computer interaction, partly due to its limited temporal (slow) and spatial (small-area and non-moving) capabilities. We introduce hydroptical thermal feedback, a spatial thermal feedback method that works by applying visible lights on body parts in water. Through physical measurements and psychophysical experiments, our results show: (1) humans perceive thermal sensations when visible lights are cast on the skin under water, and perceived warmth is greater for lights with shorter wavelengths, (2) the method's temporal capabilities, (3) apparent motion (spatial movement) of warmth and coolness sensations, and (4) hydroptical thermal feedback can support the perceptual illusion that the water itself is warmer. We propose applications, including virtual reality (VR), shared water experiences, and therapies. Overall, this paper contributes hydroptical thermal feedback as a novel method, empirical results demonstrating its unique capabilities, proposed applications, and design recommendations for using hydroptical thermal feedback. Our method introduces controlled, spatial thermal perceptions to water experiences.

2024 · UIST · Sosuke Ichihashi et al. · Topics: Vibrotactile Feedback & Skin Stimulation; Mixed Reality Workspaces; Immersion & Presence Research
Embodied Tentacle: Mapping Design to Control of Non-Analogous Body Parts with the Human Body

Manipulating a non-humanoid body using a mapping approach that translates human body activity into different structural movements enables users to perform tasks that are difficult with their innate bodies. However, a key challenge is how to design an effective mapping to control non-analogous body parts with the human body. To address this challenge, we designed an articulated virtual arm and investigated the effect of mapping methods on a user's manipulation experience. Specifically, we developed an unbranched 12-joint virtual arm with an octopus-like appearance. Using this arm, we conducted a user study to compare the effects of several mapping methods with different arrangements on task performance and subjective evaluations of embodiment and user preference. As a result, we identified three important factors in mapping: "Visual and Configurational Similarity", "Kinematics Suitability for the User", and "Correspondence with Everyday Actions." Based on these findings, we discuss a mapping design for non-humanoid body manipulation.

2024 · CHI · Shuto Takashita et al. · The University of Tokyo · Topics: Shape-Changing Interfaces & Soft Robotic Materials; Full-Body Interaction & Embodied Input
WRLKit: Computational Design of Personalized Wearable Robotic Limbs

Wearable robotic limbs (WRLs) augment human capabilities through robotic structures that attach to the user's body. While WRLs are intensely researched and various device designs have been presented, it remains difficult for non-roboticists to engage with this exciting field. We aim to empower interaction designers and application domain experts to explore novel designs and applications by rapidly prototyping personalized WRLs that are customized for different tasks, different body locations, or different users. In this paper, we present WRLKit, an interactive computational design approach that enables designers to rapidly prototype a personalized WRL without requiring extensive robotics and ergonomics expertise. The body-aware optimization approach starts by capturing the user's body dimensions and dynamic body poses. Then, an optimized, fabricable WRL structure is generated for a desired mounting location and workspace, to fit the user's body and intended task. The results of a user study and several implemented prototypes demonstrate the practical feasibility and versatility of WRLKit.

2023 · UIST · Artin Saberpour Abadian et al. · Topics: Shape-Changing Interfaces & Soft Robotic Materials; Human-Robot Collaboration (HRC)
WeightMorphy: A Dynamic Weight-Shifting Method to Enhance the Virtual Experience with Body Deformation

We propose WeightMorphy, a hand-mounted system designed to improve the operability and immersive experience of teleoperation manipulation by changing the moment of inertia. This system reduces the discrepancy between the visual shape of the virtual hand and its corresponding moment of inertia, enabling instantaneous control by the user while maintaining accuracy. We provide a detailed description of the design and concept of our system, and report experiments examining the effect of shifting the center of gravity on the operability of the deformable virtual hand using WeightMorphy.

2023 · UbiComp · Naoki Okamoto et al. · Topics: Force Feedback & Pseudo-Haptic Weight
Social Digital Cyborgs: The Collaborative Design Process of the JIZAI ARMS

Half a century since the concept of a cyborg was introduced, digital cyborgs, enabled by the spread of wearable robotics, are the focus of much recent research. We introduce JIZAI ARMS, a supernumerary robotic limb system consisting of a wearable base unit with six terminals and detachable robot arms controllable by the wearer. The system was designed to enable social interaction between multiple wearers, such as an exchange of arm(s), and to explore possible interactions between digital cyborgs in a cyborg society. This paper describes the JIZAI ARMS' design process, an interdisciplinary collaboration between human augmentation researchers, product designers, a system architect, and manufacturers, to realize a technically complex system while considering the aesthetics of a digital cyborg. We also provide an autobiographical report of our first impressions of using the JIZAI ARMS and use our findings to speculate on a model of potential social interactions between digital cyborgs.

2023 · CHI · Nahoko Yamamura et al. · University of Tokyo · Topics: Shape-Changing Interfaces & Soft Robotic Materials
Flexel: A Modular Floor Interface for Room-Scale Tactile Sensing

Human environments are physically supported by floors, which hold people and furniture up against gravity. Since our body motions continuously generate vibrations and loads that propagate to the ground, measuring these expressive signals enables unobtrusive activity sensing. In this study, we present Flexel, a modular floor interface for room-scale tactile sensing. By paving a room with floor interfaces, our system can immediately begin to infer touch positions, track user locations, recognize foot gestures, and detect object locations. Through a series of exploratory studies, we identified a preferable hardware design that adheres to construction conventions, as well as an optimal sensor density that mediates the trade-off between cost and performance. In addition, we summarize a design guideline that generalizes to other floor interfaces. Moreover, we demonstrate example applications for room-scale tactile sensing enabled by Flexel systems.

2022 · UIST · Takatoshi Yoshida et al. · Topics: Mid-Air Haptics (Ultrasonic); Foot & Wrist Interaction
Machine-Mediated Teaming: Mixture of Human and Machine in Physical Gaming Experience

Technological advancement has opened up opportunities for new sports and physical activities. We introduce a concept called "machine-mediated teaming," in which a human and a surrogate machine form a team to participate in physical sports games. To understand the experience of machine-mediated teaming and the guidelines for designing systems that achieve the concept, we built a case study system based on tug-of-war. Our system is a sports game played two against two. One team consists of a player who physically pulls the rope and another player who participates in the game by controlling the machine's actuators. We conducted user studies using this system to investigate the sport experience in this form and to reveal insights that inform future research on machine-mediated teaming. Based on the data obtained from the user studies, we identified three perspectives (machine stamina, action space, and explicit feedback) that should be considered when designing future machine-mediated teaming systems. The research presented in this paper offers a first step towards exploring how humans and machines can coexist in highly dynamic physical interactions.

2022 · CHI · Azumi Maekawa et al. · The University of Tokyo · Topics: Full-Body Interaction & Embodied Input; Serious & Functional Games
Generating the Presence of Remote Mourners: A Case Study of Funeral Webcasting in Japan

Funerals are irreplaceable events, especially for bereaved family members and relatives. However, the COVID-19 pandemic has prevented many people worldwide from attending their loved ones' funerals. The authors had the opportunity to assist one family faced with this predicament by webcasting and recording funeral rites held near Tokyo in June 2020. Using our original 360-degree telepresence system and smartphones running Zoom, we enabled the deceased's elder siblings to remotely attend the funeral and did our utmost to make them feel present in the funeral hall. Although the Zoom webcast contributed more to their remote attendance than our system did, we uncovered insights that could be useful for designing remote funeral attendance. From these findings, we also discuss how HCI designers can contribute to this highly sensitive issue, weaving together knowledge from various domains, including techno-spiritual practices, thanato-sensitive design, and other religious and cultural aspects related to death rituals.

2021 · CHI · Daisuke Uriu et al. · The University of Tokyo · Topics: Teleoperation & Telepresence; Digital Art Installations & Interactive Performance
Floral Tribute Ritual in Virtual Reality: Design and Validation of SenseVase with Virtual Memorial

While floral tributes are commonly used for the public commemoration of victims of disasters, war, and other accidents, flowers in vases color everyday life. In this research, these features of flowers are intertwined with the recent phenomenon of online memorials to develop a virtual floral tribute concept that includes physical rituals. We designed SenseVase, a smart vase that detects flowers placed in it, and a 3DCG Virtual Memorial that illustrates floral tributes given by people using SenseVases at home. This paper describes how we developed our design concept by reviewing previous literature and social aspects, and presents a video illustrating the concept. To validate the current concept, we interviewed several experts knowledgeable in public commemorations, virtual and online communities, and the floral business. Through a discussion of our findings from the design process and interviews, we propose a new direction for how HCI technology can contribute to public commemoration in addition to personal memorialization.

2021 · CHI · Daisuke Uriu et al. · The University of Tokyo · Topics: Digital Art Installations & Interactive Performance; Museum & Cultural Heritage Digitization
Dynamic Motor Skill Synthesis with Human-Machine Mutual Actuation

This paper presents an approach for coupling robotic capability with human ability in dynamic motor skills, called "Human-Machine Mutual Actuation (HMMA)." We focus specifically on throwing motions and propose a method to computationally control the release timing. Our system realizes the HMMA concept through a robotic handheld device that acts as a release controller. We conducted user studies to validate the feasibility of the concept and to clarify related technical issues. We found that the system successfully performs targeted throws while exploiting human ability. These empirical experiments suggest that robotic capability can be embedded into users' motions without compromising their sense of control. The user studies also revealed several issues to be tackled in further research on HMMA.

2020 · CHI · Azumi Maekawa et al. · The University of Tokyo · Topics: Human-Robot Collaboration (HRC)
Next Steps for Human-Computer Integration

Human-Computer Integration (HInt) is an emerging paradigm in which computational and human systems are closely interwoven. Integrating computers with the human body is not new; however, we believe that with rapid technological advancements, increasing real-world deployments, and growing ethical and societal implications, it is critical to identify an agenda for future research. We present a set of challenges for HInt research, formulated over the course of a five-day workshop with 29 experts who have designed, deployed, and studied HInt systems. This agenda aims to guide researchers in a structured way towards a more coordinated and conscientious future of human-computer integration.

2020 · CHI · Florian Floyd Mueller et al. · Monash University · Topics: Brain-Computer Interface (BCI) & Neurofeedback; Technology Ethics & Critical HCI; User Research Methods (Interviews, Surveys, Observation)
MetaArms: Body Remapping Using Feet-Controlled Artificial Arms

We introduce MetaArms, wearable anthropomorphic robotic arms and hands with six degrees of freedom operated by the user's legs and feet. Our overall research goal is to re-imagine what our bodies can do with the aid of wearable robotics using a body-remapping approach. To this end, we present an initial exploratory case study. MetaArms' two robotic arms are controlled by the user's feet motion, and the robotic hands can grip objects according to the user's toes bending. Haptic feedback correlated with the objects touched by the robotic hands is also presented on the user's feet, creating a closed-loop system. We present formal and informal evaluations of the system, the former using a 2D pointing task based on Fitts' law. The overall throughput for 12 users of the system is 1.01 bits/s (SD 0.39). We also present informal feedback from over 230 users. We find that MetaArms demonstrates the feasibility of the body-remapping approach in designing robotic limbs that may help us re-imagine what the human body can do.

2018 · UIST · MHD Yamen Saraiji et al. · Topics: In-Vehicle Haptic, Audio & Multimodal Feedback; Force Feedback & Pseudo-Haptic Weight; Human-Robot Collaboration (HRC)
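Throughput figures such as the 1.01 bits/s reported for MetaArms are conventionally derived from Fitts' law: the index of difficulty ID = log2(D/W + 1) of a pointing trial, divided by the movement time. A minimal sketch of that computation follows; the target distance, width, and timing values are hypothetical illustrations, not data from the MetaArms study.

```python
import math

def index_of_difficulty(distance: float, width: float) -> float:
    """Shannon formulation of Fitts' index of difficulty, in bits."""
    return math.log2(distance / width + 1)

def throughput(distance: float, width: float, movement_time_s: float) -> float:
    """Throughput in bits/s: index of difficulty divided by movement time."""
    return index_of_difficulty(distance, width) / movement_time_s

# Hypothetical pointing trial: 300 mm target distance, 40 mm target width,
# completed in 3.1 s -> roughly 1 bit/s, comparable in scale to the paper's figure.
tp = throughput(distance=300.0, width=40.0, movement_time_s=3.1)
```

In practice, per-trial throughputs are averaged across trials and participants, which is how a study-level mean and standard deviation like 1.01 bits/s (SD 0.39) arise.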