Exploring Opportunities for Flexible Wearables to Support Physical Training
Advances in digital fabrication with flexible and conductive materials are enabling new opportunities for customisable wearables that enhance physical training across various sports through on-body posture and movement monitoring. However, before developing new wearable devices, it is crucial to understand the unique challenges and opportunities of real-world use and the diverse needs of stakeholders such as athletes, trainers, and sports enthusiasts. In this paper, we report results of two co-design workshops conducted with 11 participants representing a range of sports backgrounds and interests to identify challenges and opportunities for flexible wearables. The participants' designs and prototypes highlight the importance of supporting personalised wearable design, addressing varying contexts, and accommodating different skill levels. We propose design considerations for wearable devices that prioritise flexibility, both in their materiality to support a wide range of movement and in their adaptability to accommodate changing conditions and user progression across skill levels.
Katarzyna Stawarz et al. DIS 2025. Topics: Haptic Wearables; Fitness Tracking & Physical Activity Monitoring.

TactStyle: Generating Tactile Textures with Generative AI for Digital Fabrication
Recent work in Generative AI enables the stylization of 3D models based on image prompts. However, these methods do not incorporate tactile information, leading to designs that lack the expected tactile properties. We present TactStyle, a system that allows creators to stylize 3D models with images while incorporating the expected tactile properties. TactStyle accomplishes this using a modified image-generation model fine-tuned to generate heightfields for given surface textures. By optimizing 3D model surfaces to embody a generated texture, TactStyle creates models that match the desired style and replicate the tactile experience. We utilize a large-scale dataset of textures to train our texture generation model. In a psychophysical experiment, we evaluate the tactile qualities of a set of 3D-printed original textures and TactStyle's generated textures. Our results show that TactStyle successfully generates a wide range of tactile features from a single image input, enabling a novel approach to haptic design.
Faraz Faruqi et al., MIT CSAIL. CHI 2025. Topics: Force Feedback & Pseudo-Haptic Weight; Shape-Changing Interfaces & Soft Robotic Materials; Generative AI (Text, Image, Music, Video).

Can AI Prompt Humans? Multimodal Agents Prompt Players' Game Actions and Show Consequences to Raise Sustainability Awareness
Unsustainable behaviors are challenging to prevent due to their long-term, often unclear consequences. Serious games offer a promising solution by creating artificial environments where players can immediately experience the outcomes of their actions. To explore this potential, we developed EcoEcho, a GenAI-powered game leveraging multimodal agents to raise sustainability awareness. These agents engage players in natural conversations, prompting them to take in-game actions that lead to visible environmental impacts. We evaluated EcoEcho using a mixed-methods approach with 23 participants. Results show a significant increase in intended sustainable behaviors post-game, although effects on attitudes towards sustainability were only marginal, suggesting that in-game actions can motivate intended real-world behaviors even when stated opinions on sustainability change little. This finding highlights the potential of multimodal agents and action-consequence mechanics to raise sustainability awareness effectively and to motivate real-world behavioral change.
Qinshi Zhang et al., University of California, San Diego. CHI 2025. Topics: Generative AI (Text, Image, Music, Video); Serious & Functional Games; Sustainable HCI.

Understanding and Improving the Performance of Action Pointing
Action pointing involves choosing and executing an action at a specific place in the workspace (e.g., choosing a tool and clicking to start drawing, or selecting an object and copying with a shortcut). The elements of action pointing (choosing an action, specifying a position, and triggering the action) can be carried out in many ways, and our analysis of current techniques identified limitations on performance, particularly for repeated sequences of interactions. To empirically analyse interaction alternatives for action pointing, we developed and evaluated two techniques: ModeKeys removes modifier keys from keyboard shortcuts used to choose actions; AimKeys goes further by using the shortcut (not the mouse) to trigger the action. Three studies over three tasks showed that these reconfigurations were highly effective: in all studies, either AimKeys or ModeKeys were faster, easier, and preferred overall. Our studies show that small variations in the configuration of action pointing can have a large impact, offering opportunities to improve performance with direct-manipulation systems.
Cameron Beattie et al., University of Saskatchewan. CHI 2025. Topics: Full-Body Interaction & Embodied Input; Knowledge Worker Tools & Workflows.

CollabJam: Studying Collaborative Haptic Experience Design for On-Body Vibrotactile Patterns
Designing vibrotactile experiences collaboratively requires communicating using multiple senses. This is challenging in remote scenarios as designers need to effectively express and communicate their intention while iteratively building and refining experiences, ideally in real-time. We formulate design considerations for collaborative haptic design tools, and propose CollabJam, a collaborative prototyping suite enabling remote synchronous design of vibrotactile experiences for on-body applications. We first outline CollabJam's features and present a technical evaluation. Second, we use CollabJam to understand communication and design patterns used during haptic experience design. We performed an in-depth design evaluation spanning four sessions in which four pairs of participants designed and reviewed vibrotactile experiences remotely. A qualitative content analysis revealed how multi-sensory communication is essential to convey ideas, how stimulating the tactile sense can interfere with personal boundaries, and how freely placing actuators on the skin can provide both benefits and challenges.
Dennis Wittchen et al., Dresden University of Applied Sciences, Faculty of Informatics/Mathematics; Max Planck Institute for Informatics, Saarland Informatics Campus (Sensorimotor Interaction). CHI 2025. Topics: Vibrotactile Feedback & Skin Stimulation; Haptic Wearables; Creative Collaboration & Feedback Systems.

Motion-Coupled Asymmetric Vibration for Pseudo Force Rendering in Virtual Reality
In Virtual Reality (VR), rendering realistic forces is crucial for immersion, but traditional vibrotactile feedback fails to convey force sensations effectively. Studies of asymmetric vibrations that elicit pseudo forces show promise but are inherently tied to unwanted vibrations, reducing realism. Leveraging sensory attenuation to reduce the perceived intensity of self-generated vibrations during user movement, we present a novel algorithm that couples asymmetric vibrations with user motion, which mimics self-generated sensations. Our psychophysics study with 12 participants shows that motion-coupled asymmetric vibration attenuates the experience of vibration (equivalent to a ~30% reduction in vibration amplitude) while preserving the experience of force, compared to continuous asymmetric vibrations (state-of-the-art). We demonstrate the effectiveness of our approach in VR through three scenarios: shooting arrows, lifting weights, and simulating haptic magnets. Results revealed that participants preferred forces elicited by motion-coupled asymmetric vibration for tasks like shooting arrows and lifting weights. This research highlights the potential of motion-coupled asymmetric vibrations, offers new insights into sensory attenuation, and advances force rendering in VR.
Nihar Sabnis et al., Max Planck Institute for Informatics, Saarland Informatics Campus (Sensorimotor Interaction). CHI 2025. Topics: Force Feedback & Pseudo-Haptic Weight.

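The motion-coupling idea in the abstract above can be illustrated as a simple amplitude gate: the asymmetric waveform only plays, and scales up, while the hand is moving, so the stimulus behaves like a self-generated sensation. This is a hypothetical sketch of the general concept, not the authors' algorithm; the sawtooth waveform, speed thresholds, and gain curve are invented example values.

```python
def asymmetric_sample(t, freq=40.0):
    """One sample of a sawtooth waveform: abrupt flank at each phase wrap,
    slow linear ramp in between. Illustrative stand-in for an asymmetric
    vibration profile; real pseudo-force waveforms are tuned per actuator."""
    phase = (t * freq) % 1.0
    return 1.0 - 2.0 * phase

def motion_coupled_amplitude(hand_speed, threshold=0.05, full_speed=0.5):
    """Scale vibration amplitude with hand speed (m/s): silent at rest,
    full amplitude during fast motion. Constants are made-up examples."""
    if hand_speed <= threshold:
        return 0.0
    return min(1.0, (hand_speed - threshold) / (full_speed - threshold))

def render(t, hand_speed):
    """Motion-coupled output: waveform gated by the current hand speed."""
    return motion_coupled_amplitude(hand_speed) * asymmetric_sample(t)
```

A stationary hand produces no vibration at all, which is what lets sensory attenuation mask the residual buzz during movement.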
Spatial Haptics: A Sensory Substitution Method for Distal Object Detection Using Tactile Cues
We present a sensory substitution-based method for representing locations of remote objects in 3D space via haptics. By imitating auditory localization processes, we enable vibrotactile localization abilities similar to those of some spiders, elephants, and other species. We evaluated this concept in virtual reality by modulating the vibration amplitude of two controllers depending on relative locations to a target. We developed two implementations applying this method using either ear or hand locations. A proof-of-concept study assessed localization performance and user experience, achieving under 30° differentiation between horizontal targets with no prior training. This unique approach enables localization by using only two actuators, requires low computational power, and could potentially assist users in gaining spatial awareness in challenging environments. We compare the implementations and discuss the use of hands as ears in motion, a novel technique not previously explored in the sensory substitution literature.
Iddo Yehoshua Wald et al., University of Bremen, Digital Media Lab. CHI 2025. Topics: Vibrotactile Feedback & Skin Stimulation; Full-Body Interaction & Embodied Input.

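The two-actuator amplitude modulation described above (levels set by the target's relative location, mimicking interaural level differences in auditory localization) can be sketched roughly as follows. This is an assumed illustration, not the paper's implementation; the sine panning law and the distance falloff are example choices.

```python
import math

def controller_amplitudes(listener_pos, listener_yaw, target_pos):
    """Split vibration intensity across left/right controllers based on the
    horizontal bearing to the target. Positions are (x, z) tuples; yaw is
    in radians. Conceptual sketch with invented gain functions."""
    dx = target_pos[0] - listener_pos[0]
    dz = target_pos[1] - listener_pos[1]
    angle = math.atan2(dx, dz) - listener_yaw   # signed bearing to target
    pan = math.sin(angle)                       # -1 (full left) .. +1 (full right)
    dist = math.hypot(dx, dz)
    gain = 1.0 / (1.0 + dist)                   # simple distance falloff
    left = gain * (1.0 - pan) / 2.0
    right = gain * (1.0 + pan) / 2.0
    return left, right
```

A target straight ahead drives both controllers equally; as it moves to one side, the level difference between the two actuators grows, which is the cue the user learns to read.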
Effects of Device Environment and Information Layout on Spatial Memory and Performance in VR Selection Tasks
Virtual Reality systems are increasingly proposed as a platform for everyday interactive software. Many applications are dependent on actions such as navigation and selection, but it is not clear how well immersive environments support these basic activities. Previous studies have suggested advantages for spatial learning in VR, so we carried out a study that investigated the effects of two aspects of immersion on spatial memory and selection: the degree to which the user is immersed in the data, and whether the system uses immersive input and output. The study showed that more-immersive conditions had substantially worse selection performance, and did not improve spatial learning. However, most participants believed that the immersive conditions were better for learning object locations, and most people preferred the immersive layout and the HMD. Our study suggests that designers should be cautious about assuming that everyday software applications will benefit from being deployed in an immersive VR environment.
Kim Kargut et al., University of Saskatchewan. CHI 2024. Topics: Eye Tracking & Gaze Interaction; Immersion & Presence Research.

GestureExplorer: Immersive Visualisation and Exploration of Gesture Data
This paper presents the design and evaluation of GestureExplorer, an Immersive Analytics tool that supports interactive exploration, classification, and sensemaking with large sets of 3D temporal gesture data. GestureExplorer features 3D skeletal and trajectory visualisations of gestures combined with abstract visualisations of clustered sets of gestures. By leveraging the large immersive space afforded by a Virtual Reality interface, our tool allows free navigation and control of viewing perspective for users to gain a better understanding of gestures. We explored a selection of classification methods to provide an overview of the dataset that was linked to a detailed view of the data that showed different visualisation modalities. We evaluated GestureExplorer with two user studies and collected feedback from participants with diverse visualisation and analytics backgrounds. Our results demonstrated the promising capability of GestureExplorer for providing a useful and engaging experience in exploring and analysing gesture data.
Ang Li et al., Monash University. CHI 2023. Topics: Hand Gesture Recognition; Interactive Data Visualization.

'Specially For You' – Examining the Barnum Effect's Influence on the Perceived Quality of System Recommendations
The 'Barnum effect' is a psychological phenomenon under which people assign higher quality ratings to personality descriptions developed 'specially for you' than the same descriptions described as 'generally true of people.' This effect suggests that recommender interfaces could elevate the perceived quality of recommendations simply by indicating that they are explicitly personalised. We therefore conducted a crowd-sourced experiment (n=492) that examined the perceived quality of personalised versus non-personalised movie recommendations for good and bad movies; importantly, the actual recommendations were identical, and were merely presented as being either personalised or not. Contrary to the Barnum effect, results showed numerically lower mean quality scores for personalised recommendations, but with no significant difference. Our findings suggest that Barnum-like effects of personalisation have at most a small influence on perceived quality, and that designers should not rely on this effect to improve user experience (despite online design guidance suggesting the opposite).
Pang Suwanaposee et al., University of Canterbury. CHI 2023. Topics: Recommender System UX; Visualization Perception & Cognition.

XR-LIVE: Enhancing Asynchronous Shared-Space Demonstrations with Spatial-temporal Assistive Toolsets for Effective Learning in Immersive Virtual Laboratories
An immersive virtual laboratory (VL) can offer flexibility of time and space, as well as safety, for remote students to conduct laboratory activities through online experiential learning. Recording an instructor's demonstration inside a VL allows students to learn directly from that demonstration. However, students have to learn from a recording while controlling the playback, which requires additional spatial and temporal attention. This additional attention load could lead to mistakes in following laboratory procedures. We identified four design requirements to reduce attention load in VLs: organized learning steps, improved student sense of co-presence, reduction of task-instructor split-attention, and learning independent of interpersonal distance. Based on these requirements, we designed and implemented XR-LIVE, a set of spatial-temporal assistive toolsets for virtual-environment laboratories that reduces mental load and enhances learning in asynchronous shared-space demonstrations, implemented based on the setup of a standard civil engineering laboratory. We also analyzed students' behavior in the VL demonstration to derive design guidelines applicable to generic VLs.
Santawat Thanyadit et al. CSCW 2022. Topics: XR in Place and Space.

VRhook: A Data Collection Tool for VR Motion Sickness Research
Despite the increasing popularity of VR games, one factor hindering the industry's rapid growth is motion sickness experienced by users. Symptoms such as fatigue and nausea severely hamper the user experience. Machine Learning methods could be used to automatically detect motion sickness in VR experiences, but generating the extensive labeled dataset needed is a challenging task: it requires either very time-consuming manual labeling by human experts or modification of proprietary VR application source code for label capturing. To overcome these challenges, we developed a novel data collection tool, VRhook, which can collect data from any VR game without needing access to its source code. This is achieved by dynamic hooking, where we inject custom code into a game's run-time memory to record each video frame and its associated transformation matrices. Using this, we can automatically extract various useful labels such as rotation, speed, and acceleration. In addition, VRhook can blend a customized screen overlay on top of game contents to collect self-reported comfort scores. In this paper, we describe the technical development of VRhook, demonstrate its utility with an example, and describe directions for future research.
Elliott Wen et al. UIST 2022. Topics: Motion Sickness & Passenger Experience; Immersion & Presence Research.

Probability Weighting in Interactive Decisions: Evidence for Overuse of Bad Assistance, Underuse of Good Assistance
The effective use of assistive interfaces (i.e., those that offer suggestions or reform the user's input to match inferred intentions) depends on users making good decisions about whether and when to engage or ignore assistive features. However, prior work from economics and psychology shows systematic decision-making biases in which people overreact to low probability events and underreact to high probability events, modelled using a probability weighting function. We examine the theoretical implications of this probability weighting for interaction, including its suggestion that users will overuse inaccurate interface assistance and underuse accurate assistance. We then conduct a new analysis of data from a previously published study, quantifying the degree of bias users exhibited, and demonstrating conformance with these predictions. We discuss implications for design, including strategies that could be used to mitigate the deleterious effects of the observed biases.
Andy Cockburn et al., University of Canterbury. CHI 2022. Topics: Explainable AI (XAI); AI-Assisted Decision-Making & Automation; AI Ethics, Fairness & Accountability.

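The probability weighting function mentioned above is conventionally written in the one-parameter form of Tversky and Kahneman (1992), w(p) = p^γ / (p^γ + (1 − p)^γ)^(1/γ), which overweights small probabilities and underweights large ones. A minimal sketch of that standard form (γ = 0.61 is a commonly reported fit from the decision-making literature, not necessarily the estimate in this paper):

```python
def weight(p, gamma=0.61):
    """Tversky-Kahneman (1992) one-parameter probability weighting function.
    For gamma < 1, small p are overweighted and large p underweighted,
    matching the overuse/underuse pattern the abstract describes."""
    num = p ** gamma
    return num / (num + (1.0 - p) ** gamma) ** (1.0 / gamma)
```

Under this curve a 5%-accurate assistant feels more useful than it is, while a 95%-accurate assistant feels less useful than it is, which is the mechanism behind the paper's title.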
More Errors vs. Longer Commands: The Effects of Repetition and Reduced Expressiveness on Input Interpretation Error, Learning, and User Preference
Many interactive systems are susceptible to misinterpreting the user's input actions or gestures. Interpretation errors are common when systems gather a series of signals from the user and then attempt to interpret the user's intention based on those signals (e.g., gesture identification from a touchscreen, camera, or body-worn electrodes), and previous work has shown that interpretation error can cause significant problems for learning new input commands. Error-reduction strategies from telecommunications, such as repeating a command or increasing the length of the input while reducing its expressiveness, could improve these input mechanisms, but little is known about whether longer command sequences will cause problems for users (e.g., increased effort or reduced learning). We tested performance, learning, and perceived effort in a crowd-sourced study where participants learned and used input mechanisms with different error-reduction techniques. We found that error reduction techniques are feasible, can outperform error-prone ordinary input, and do not negatively affect learning or perceived effort.
Kevin C. Lam et al., University of Saskatchewan. CHI 2022. Topics: Hand Gesture Recognition; Human Pose & Activity Recognition.

On the Use of Multi-sensory Cues in Symmetric and Asymmetric Shared Collaborative Virtual Spaces
Physical face-to-face collaboration gives a higher-quality experience than mediated communication options, such as phone- or video-based chat. Participants can share rich sensory cues to multiple human senses in a physical space. Also, the perceptual sensing of the surrounding environment, including other people's reactions, can influence human communication and emotion, and thus collaborative performance. Shared spaces in virtual environments provide degraded sensory experiences because most commercial virtual reality systems typically provide only visual and audio feedback. The impact of richer, multi-sensory feedback on joint decision-making tasks in VR is still an open area of research. Two independent studies exploring this topic are presented in this paper. We implemented a multi-sensory system that delivers vision, audio, tactile, and smell feedback, and we compared the system to a typical VR system. The scenario placed two users in a virtual theme-park safari ride with a number of non-player character (NPC) passengers to simulate realistic real-world scenarios, and we varied the type and complexity of the NPCs' reactions to participants. In Experiment 1, we provided both users with either multi-sensory or typical sensory feedback symmetrically as a between-subjects factor, and used NPC reaction type as a within-subjects factor. In Experiment 2, we provided sensory feedback asymmetrically to each user (i.e., one had multi-sensory cues and the other had typical sensory cues) as a between-subjects factor, and used NPC reaction type as a within-subjects factor. We found that the number of sensory channels and NPC reactions did not influence user perception significantly under either symmetric or asymmetric sensory feedback conditions. However, after accounting for individual personality traits (e.g., assertive, passive), as well as any existing relationship between the pairs, we found that increasing the number of sensory channels can significantly improve subjective responses.
Sungchul Jung et al. CSCW 2021. Topics: VR and Immersive Interfaces.

The Effects of System Interpretation Errors on Learning New Input Mechanisms
Input mechanisms can produce noisy signals that computers must interpret, and this interpretation can misconstrue the user's intention. Researchers have studied how interpretation errors can affect users' task performance, but little is known about how these errors affect learning, and whether they help or hinder the transition to expertise. Previous findings suggest that increasing the user's attention can facilitate learning, so frequent interpretation errors may increase attention and learning; alternatively, however, interpretation errors may negatively interfere with skill development. To explore these potentially important effects, we conducted studies where participants learned commands with various rates of artificially injected interpretation errors. Our results showed that higher rates of interpretation error led to worse memory retention, higher completion times, higher occurrences of user error (beyond those injected by the system), and greater perceived effort. These findings indicate that when input mechanisms must interpret the user's input, interpretation errors cause problems for user learning.
Kevin C. Lam et al., University of Saskatchewan. CHI 2021. Topics: Hand Gesture Recognition; Eye Tracking & Gaze Interaction.

Interaction Pace and User Preferences
The overall pace of interaction combines the user's pace and the system's pace, and a pace mismatch could impair user preferences (e.g., animations or timeouts that are too fast or slow for the user). Motivated by studies of speech rate convergence, we conducted an experiment to examine whether user preferences for system pace are correlated with user pace. Subjects first completed a series of trials to determine their user pace. They then completed a series of hierarchical drag-and-drop trials in which folders automatically expanded when the cursor hovered for longer than a controlled timeout. Results showed that preferences for timeout values correlated with user pace: slow-paced users preferred long timeouts, and fast-paced users preferred short timeouts. Results indicate potential benefits in moving away from fixed or customisable settings for system pace. Instead, systems could improve preferences by automatically adapting their pace to converge towards that of the user.
Alix Goguey et al., Université Grenoble Alpes. CHI 2021. Topics: Visualization Perception & Cognition.

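The closing suggestion above, a system pace that converges towards the user's pace, could be realized with something as simple as an exponential moving average over observed action intervals. The following is a speculative sketch of that idea (the paper measured preferences rather than proposing this controller; the class, its parameters, and all constants are illustrative):

```python
class AdaptiveTimeout:
    """Adapt a hover-to-expand timeout towards the user's observed pace
    using an exponential moving average. Hypothetical sketch; initial
    value, smoothing factor, and clamp range are example choices."""

    def __init__(self, initial_ms=500.0, alpha=0.2, lo=150.0, hi=1500.0):
        self.timeout_ms = initial_ms
        self.alpha = alpha        # smoothing factor: fraction moved per observation
        self.lo, self.hi = lo, hi # clamp to sane bounds

    def observe_action_interval(self, interval_ms):
        """Feed in a fresh measure of user pace (e.g., time between
        consecutive drag actions) and nudge the timeout towards it."""
        self.timeout_ms += self.alpha * (interval_ms - self.timeout_ms)
        self.timeout_ms = max(self.lo, min(self.hi, self.timeout_ms))
        return self.timeout_ms
```

A fast-paced user's short intervals pull the timeout down towards short expansions, while a slow-paced user's long intervals push it up, matching the preference pattern the study found.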
Interaction Interferences: Implications of Last-Instant System State Changes
We study interaction interferences, situations where an unexpected change occurs in an interface immediately before the user performs an action, causing the corresponding input to be misinterpreted by the system. For example, a user tries to select an item in a list, but the list is automatically updated immediately before the click, causing the wrong item to be selected. First, we formally define interaction interferences and discuss their causes from behavioral and system-design perspectives. Then, we report the results of a survey examining users' perceptions of the frequency, frustration, and severity of interaction interferences. We also report a controlled experiment, based on state-of-the-art experimental protocols from neuroscience, that explores the minimum time interval, before clicking, below which participants could not refrain from completing their action. Finally, we discuss our findings and their implications for system design, paving the way for future work.
Philippe Schmid et al. UIST 2020. Topics: Privacy by Design & User Control; Notification & Interruption Management.

KeyMap: Improving Keyboard Shortcut Vocabulary Using Norman's Mapping
We introduce a new shortcut interface called KeyMap that is designed to leverage Norman's principle of natural mapping. Rather than displaying shortcut command labels in linear menus, KeyMap displays a virtual keyboard with command labels displayed directly on its keys. A crowdsourced experiment compares KeyMap to Malacria et al.'s ExposeHK using an extension of their protocol to also test recall. Results show KeyMap users remembered one more shortcut than ExposeHK users immediately after training, and this advantage increased to 4.5 more shortcuts when tested again after 24 hours. KeyMap users also incidentally learned more shortcuts that they had never practised. We demonstrate how KeyMap can be added to existing web-based applications using a Chrome extension.
Blaine Lewis et al., University of Toronto. CHI 2020. Topics: Prototyping & User Testing.

Framing Effects Influence Interface Feature Decisions
Studies in psychology have shown that framing effects, where the positive or negative attributes of logically equivalent choices are emphasised, influence people's decisions. When outcomes are uncertain, framing effects also induce patterns of choice reversal, where decisions tend to be risk averse when gains are emphasised and risk seeking when losses are emphasised. Studies of these effects typically use potent framing stimuli, such as the mortality of people suffering from diseases or personal financial standing. We examine whether these effects arise in users' decisions about interface features, which typically have less visceral consequences, using a crowd-sourced study based on snap-to-grid drag-and-drop tasks (n = 842). The study examined several framing conditions: those similar to prior psychological research, and those similar to typical interaction choices (enabling/disabling features). Results indicate that attribute framing strongly influences users' decisions, that these decisions conform to patterns of risk seeking for losses, and that patterns of choice reversal occur.
Andy Cockburn et al., University of Canterbury. CHI 2020. Topics: Explainable AI (XAI); Visualization Perception & Cognition; User Research Methods (Interviews, Surveys, Observation).