Introspectus AI: Long-term AI-Driven Dialogue Training To Promote Self-Reflection
IntrospectusAI is a generative AI-based system designed to enhance self-reflection and support positive behavior change. By leveraging multimodal information from users' daily life recordings, it provides personalized and detailed feedback, aiming to deepen self-awareness and facilitate positive behavioral adjustments. This study explores the short-term and long-term impacts of interacting with IntrospectusAI, focusing on its potential to enhance reflective practices and improve the acceptance of generative AI tools. The user experience was defined through an initial round of workshops with four experts. The resulting system was then evaluated in a long-term study involving 64 participants. The results demonstrate that AI-supported interventions significantly improved engagement in self-reflection, the need for reflection, and insight, while also increasing user acceptance of generative AI over time. These findings underscore the potential of generative AI as a practical tool for self-improvement, offering insights into its broader applicability in promoting well-being and personal growth.
2025 · Shengyin Li et al. · Communicating With/Through AI · CSCW

The Grounded Experience: The Effect of Floor Design Typologies on Human Behavioral and Cognitive Experience
Design reflects the human tendency to adapt to and inhabit surroundings, with architectural decisions significantly shaping behavioral and cognitive experiences. This pictorial focuses on the floor as a primary, embodied interface in space. To explore its influence, five floor design typologies (completing, switching, zoning, stimulating, and bending) were identified through twenty hours of expert discussions involving architects, designers, an artist, an engineer, and researchers. A collaborative workshop further defined sub-categories via participatory observations. Fieldwork then informed site selection for an observational study, which confirmed the behavioral and cognitive impacts of the identified typologies. Based on these findings, floor codes were developed by shifting the design focus from visual cues to somatic sensations and applied in design scenarios. This research contributes to understanding human experience in architectural environments. It offers insights for virtual architecture, proposing evidence-based strategies for designing personalized and interactive spaces in virtual and mixed-reality contexts.
2025 · Burcu Nimet Dumlu et al. · Mixed Reality Workspaces · Digital Art Installations & Interactive Performance · Interactive Narrative & Immersive Storytelling · DIS

A Placebo Concert: The Placebo Effect for Visualization of Physiological Audience Data during Experience Recreation in Virtual Reality
A core use case for Virtual Reality applications is recreating real-life scenarios for training or entertainment. Promoting physiological responses in VR users that match those of real-life spectators can maximize engagement and contribute to greater co-presence. Current research focuses on visualizations and measurements of physiological data to ensure experience accuracy. However, placebo effects are known to influence performance and self-perception in HCI studies, creating a need to investigate the effect of visualizing different types of data (real, unmatched, and fake) on user perception during event recreation in VR. We investigate these conditions through a balanced between-groups study (n=44) of uninformed and informed participants. The informed group was told that the data visualizations represented previously recorded human physiological data. Our findings reveal a placebo effect: the informed group demonstrated enhanced engagement and co-presence. Additionally, the fake data condition in the informed group evoked a positive emotional response.
2025 · Xiaru Meng et al. · Keio University Graduate School of Media Design · Social & Collaborative VR · Immersion & Presence Research · Visualization Perception & Cognition · CHI

“It’s Like Being On Stage”: Conveying Dancers’ Expressiveness Through A Haptic-Installed Contemporary Dance Performance
In dance performances, choreography, music and lighting are combined to convey meaning to the audience. However, this communication typically relies on visual and auditory stimuli alone. While haptic technologies have been leveraged to enhance the perception of dancers’ movements, less focus has been placed on exploring their potential in enhancing dancers’ somatic expressiveness. Through co-design activities with 5 professional contemporary dancers, we crafted an interdisciplinary combination of choreography and haptics. In total, 128 audience members watched one of three live performances while wearing custom-made haptic wristbands. From an open-ended questionnaire and interviews with audience members, we explore how the introduction of haptics deepens their embodied sensations and helps to create a sense of resonance with the dancers. Based on our findings, we discuss implications for future directions in how haptic technologies could drive innovation in dance performances from the point of view of both dancers’ creativity and audience experiences.
2025 · Ximing Shen et al. · Keio University Graduate School of Media Design · Haptic Wearables · Dance & Body Movement Computing · CHI

Haptic Empathy: Investigating Individual Differences in Affective Haptic Communications
Touch remains essential for conveying emotion and for interpersonal communication, even as more interactions are mediated remotely. While many studies have discussed the effectiveness of using haptics to communicate emotions, incorporating affect into haptic design still faces challenges due to individual differences in users' tactile acuity and preferences. We assessed the conveyance of emotions using a two-channel haptic display, with an emphasis on individual differences. First, 24 participants generated 187 haptic messages reflecting their immediate sentiments after watching 8 emotionally charged film clips. Afterwards, 19 participants were asked to identify emotions from haptic messages designed by themselves and others, yielding 593 samples. Our findings indicate that the ability to decode haptic messages is linked to specific emotional traits, particularly Emotional Competence (EC) and Affect Intensity Measure (AIM). Additionally, qualitative analysis revealed three strategies participants used to create touch messages: perceptive, empathetic, and metaphorical expression.
2025 · Yulan Ju et al. · Keio University Graduate School of Media Design · Vibrotactile Feedback & Skin Stimulation · Haptic Wearables · Agent Personality & Anthropomorphism · CHI

Living Bento: Heartbeat-Driven Noodles for Enriched Dining Dynamics
Previous Human-Food Interaction research has indicated that external devices can support focused eating, dining socialization, and immersion. However, methods that focus on the food itself and on the diners themselves have remained underdeveloped. In this study, we integrated biofeedback with food, using diners' heart rates to drive the food's appearance in order to promote focused eating and dining socialization. By employing LED lights, we dynamically displayed diners' real-time physiological signals through the transparency of the food. Results revealed significant effects on various aspects of dining immersion, such as awareness perceptions, attractiveness, attentiveness to each bite, and emotional bonds with the food. Furthermore, to promote dining socialization, we established a “Sharing Bio-Sync Food” dining system to strengthen emotional connections between diners. Based on these findings, we developed tableware that integrates biofeedback into the culinary experience.
2025 · Weijen Chen et al. · Keio University Graduate School of Media Design · Biosensors & Physiological Monitoring · Food Culture & Food Interaction · CHI

SealMates: Improving Communication in Video Conferencing using a Collective Behavior-Driven Avatar
The limited nonverbal cues and spatially distributed nature of remote communication make it challenging for unacquainted members to be expressive during social interactions over video conferencing. Though video conferencing enables seeing others’ facial expressions, this visual feedback can instead lead to unexpected self-focus, causing users to miss cues inviting others to engage in the conversation equally. To support expressive communication and equal participation among unacquainted counterparts, we propose SealMates, a behavior-driven avatar that infers the engagement level of the group from collective gaze and speech patterns and then moves across interlocutors' windows in the video conference. In a controlled experiment with 15 groups of triads, we found the avatar's movement encouraged people to experience more self-disclosure and made them perceive that everyone was more equally engaged in the conversation than when there was no behavior-driven avatar. We discuss how a behavior-driven avatar influences distributed members' perceptions and the implications of avatar-mediated communication for future platforms.
2024 · Mark Armstrong et al. · Session 4f: Multiplayer Gaming and Communication · CSCW

DexteriSync: A Hand Thermal I/O Exoskeleton for Morphing Finger Dexterity Experience
Skin temperature is an important physiological factor for human hand dexterity. Leveraging this feature, we engineered an exoskeleton, called DexteriSync, that can dynamically adjust the user's finger dexterity and induce different thermal perceptions by modulating finger skin temperature. The exoskeleton comprises flexible silicone-copper tube segments, 3D-printed finger sockets, a 3D-printed palm base, a pump system, and a water temperature control unit with storage. By realising an embodied experience of compromised dexterity, DexteriSync can help product designers understand the lived experience of compromised hand dexterity, such as that of elderly and/or neurodivergent users, when designing daily necessities for them. We validated DexteriSync via a technical evaluation and two user studies, demonstrating that it can change skin temperature, dexterity, and thermal perception. An exploratory session with design students and an autistic individual with compromised dexterity demonstrated that the exoskeleton provided a more realistic experience than video education and allowed them to gain higher confidence in their designs. The results support the efficacy of experiencing embodied compromised finger dexterity, which can promote an understanding of the related physical challenges and lead to more persuasive designs for assistive tools.
2024 · Ximing Shen et al. · Haptic Wearables · Motor Impairment Assistive Input Technologies · UIST

People with Disabilities Redefining Identity through Robotic and Virtual Avatars: A Case Study in Avatar Robot Cafe
Robotic avatars and telepresence technology enable people with disabilities to engage in physical work. Despite the recent popularity of the metaverse, few studies have explored the use of virtual avatars and environments by people with disabilities. In this study, seven disabled participants working in a cafe where remote customer service is provided via robotic avatars were engaged in the development and use of personalized virtual avatars displayed on a large screen in-situ, in combination with existing physical robots, creating a hybrid cyber-physical space. We conducted longitudinal semi-structured interviews to investigate the psychological changes experienced by the participants. The results revealed that mass-produced robotic avatars allowed participants to not disclose their disability if they did not want to, but also backgrounded their identities; by contrast, customized virtual avatars, shaped without physical constraints, highlighted their personalities. The combined use of robotic and virtual avatars complemented each other and can support pilots in redefining their identity.
2024 · Yuji Hatada et al. · The University of Tokyo · Identity & Avatars in XR · Social Robot Interaction · Teleoperation & Telepresence · CHI

Cymatics Cup: Shape-Changing Drinks by Leveraging Cymatics
To enhance the dining experience, prior studies in Human-Computer Interaction (HCI) and gastrophysics have demonstrated that modifying the static shape of solid foods can amplify taste perception. However, the exploration of dynamic shape-changing mechanisms in liquid foods remains largely untapped. In the present study, we employ cymatics, a scientific discipline focused on utilizing sound frequencies to generate patterns in liquids and particles, to augment the drinking experience. Using speakers, we dynamically reshaped liquids exhibiting five distinct taste profiles and evaluated the resulting changes in taste perception and drinking experience. Our research objectives extend beyond merely augmenting taste from visual to tactile sensations; we also prioritize the experiential aspects of drinking. Through a series of experiments and workshops, we revealed a significant impact on taste perception and overall drinking experience when mediated by cymatics effects. Building upon these findings, we designed and developed tableware that integrates cymatics principles into gastronomic experiences.
2024 · Weijen Chen et al. · Keio University Graduate School of Media Design · Food Culture & Food Interaction · CHI

Maintaining Continuing Bonds in Bereavement: A Participatory Design Process of Be.side
During the grieving process, physical objects often serve as catalysts for remembering and honouring the relationship with departed loved ones. Leveraging a participatory design approach, we created Be.side, a fully customisable multi-modal artefact that incorporates scent, sound, and heartbeat stimulation and acts as a touch-point between the deceased and the bereaved. We conducted a four-week study with three participants to understand how the artefact, continuously attuned to each participant, helped to continue bonds with the deceased. Our results show that Be.side’s bespoke elements helped participants to evoke memories of the deceased. Participants created personalised rituals for remembrance. They sustained bonds not only by interacting with Be.side but also by participating in the research. Finally, highlighting that remembrance can both provide comfort and deepen sadness, we discuss future design considerations.
2024 · Jieun Kim et al. · Keio University Graduate School of Media Design, Myongji University · Haptic Wearables · Shape-Changing Interfaces & Soft Robotic Materials · Food Culture & Food Interaction · CHI

“I am both here and there”: Parallel Control of Multiple Robotic Avatars by Disabled Workers in a Cafe
Robotic avatars can help disabled people extend their reach in interacting with the world. Technological advances make it possible for individuals to embody multiple avatars simultaneously. However, existing studies have been limited to laboratory conditions and did not involve disabled participants. In this paper, we present a real-world implementation of a parallel control system allowing disabled workers in a café to embody multiple robotic avatars at the same time to carry out different tasks. Our data corpus comprises semi-structured interviews with workers, customer surveys, and videos of café operations. Results indicate that the system increases workers' agency, enabling them to better manage customer journeys. Parallel embodiment and transitions between avatars create multiple interaction loops where the links between disabled workers and customers remain consistent, but the intermediary avatar changes. Based on our observations, we theorize that disabled individuals possess specific competencies that increase their ability to manage multiple avatar bodies.
2023 · Giulia Barbareschi et al. · Keio University · Domestic Robots · Social Robot Interaction · Robots in Education & Healthcare · CHI

Dementia Eyes: Co-Design and Evaluation of a Dementia Education Augmented Reality Experience for Medical Workers
Dementia describes a syndrome of cognitive degeneration, and Behavioural and Psychological Symptoms of Dementia (BPSD) are its non-cognitive symptoms. BPSD can be alleviated through care services. To support better care, we explore the potential of using Augmented Reality (AR) to support dementia education for medical workers in three steps: (1) we explore medical workers' perspectives on the lived experience of dementia care and on XR, (2) we co-design an educational experience comprising an AR-based application and a 5-minute activity with medical workers, and (3) we evaluate the effectiveness of the system through a mixed-methods study. Our results show that the AR experience successfully touched participants emotionally and motivated them to reflect on the care services they provide. On this basis, we discuss the elements and challenges of designing XR-enabled dementia education for users unfamiliar with novel technology, and the potential of using XR in clinical education.
2023 · Ximing Shen et al. · Keio University Graduate School of Media Design · AR Navigation & Context Awareness · VR Medical Training & Rehabilitation · CHI

ThermalBracelet: Exploring Thermal Haptic Feedback Around the Wrist
Smartwatches make the wrist an ideal location for always-available haptic notifications, as they are worn constantly and in direct contact with the skin. With wrist straps, haptic feedback can be extended to the full space around the wrist to provide more spatial and enriched feedback. With ThermalBracelet, we investigate thermal feedback as a haptic feedback modality around the wrist. We present three studies that led to the development of a smartwatch-integratable thermal bracelet that stimulates six locations around the wrist. Our initial evaluation reports on the selection of thermal module configurations. Second, with the selected six-module configuration, we explore its usability in real-world scenarios such as walking and reading. Third, we investigate its capability to provide spatio-temporal feedback while users are engaged in distracting tasks. Finally, we present application scenarios that demonstrate its usability.
2019 · Roshan Lalitha Peiris et al. · Keio University Graduate School of Media Design · Foot & Wrist Interaction · Biosensors & Physiological Monitoring · Context-Aware Computing · CHI

MetaArms: Body Remapping Using Feet-Controlled Artificial Arms
We introduce MetaArms, wearable anthropomorphic robotic arms and hands with six degrees of freedom operated by the user’s legs and feet. Our overall research goal is to re-imagine what our bodies can do with the aid of wearable robotics using a body-remapping approach. To this end, we present an initial exploratory case study. MetaArms’ two robotic arms are controlled by the user’s foot motion, and the robotic hands can grip objects according to the bending of the user’s toes. Haptic feedback that correlates with the objects touched by the robotic hands is also presented on the user’s feet, creating a closed-loop system. We present formal and informal evaluations of the system, the former using a 2D pointing task based on Fitts’ Law. The overall throughput for 12 users of the system is reported as 1.01 bits/s (std 0.39). We also present informal feedback from over 230 users. We find that MetaArms demonstrates the feasibility of the body-remapping approach in designing robotic limbs that may help us re-imagine what the human body can do.
2018 · MHD Yamen Saraiji et al. · In-Vehicle Haptic, Audio & Multimodal Feedback · Force Feedback & Pseudo-Haptic Weight · Human-Robot Collaboration (HRC) · UIST
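The throughput figure in the MetaArms entry comes from a Fitts' Law pointing task. As a minimal sketch of how such a number is typically computed, the snippet below uses the standard Shannon formulation (throughput = index of difficulty / movement time); the specific target distance, width, and movement time are hypothetical, not values from the paper.

```python
import math

def fitts_throughput(distance, width, movement_time):
    """Throughput in bits/s using the Shannon formulation of Fitts' law.

    index of difficulty ID = log2(distance / width + 1), in bits;
    throughput = ID / movement_time, in bits/s.
    """
    index_of_difficulty = math.log2(distance / width + 1)
    return index_of_difficulty / movement_time

# Hypothetical trial: a 240 mm reach to a 30 mm target completed in 3.1 s
# gives ID = log2(9) ~ 3.17 bits, i.e. roughly 1 bit/s -- the same order
# of magnitude as the 1.01 bits/s reported for MetaArms.
print(round(fitts_throughput(distance=240, width=30, movement_time=3.1), 2))
```

In practice throughput is averaged over many trials and participants (the reported 1.01 bits/s is a 12-user mean), but the per-trial computation is this simple ratio.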