Repurposing Audio Playback Tools to Test Human Localization with 6DoF Sound
Six-degree-of-freedom audio is of growing interest in interactive software, but it does not easily conform to object-based rendering when achieved with arrays of ambisonic microphones. Prior studies also rely on subjective metrics, which do not clearly indicate how this additional audio interaction might aid a human in a localization task, an indication of enhanced spatial awareness of a sound event. In this paper, we propose an alternative recording and playback technique for six-degree-of-freedom audio that minimizes recording overhead, yields object-based rendering, and verifies enhanced spatialization through objective testing. The approach utilizes existing audio playback tools in the Unity game engine and can be redeployed quickly, allowing researchers outside audio engineering to explore six-degree-of-freedom audio applications. Two studies were conducted with a group of participants using a Microsoft HoloLens 2: one testing interpretation of directional sound cues from a stationary position, and one testing the proposed technique in a mobile task. Participants were able to discern additional information within the front-facing "blind spots" and were effectively perfect in a localization task with the proposed audio technique. Participants did not achieve the same performance level with a head-related transfer function alone, indicating meaningful cueing from six-degree-of-freedom sound.
2025 | Dan Rehberg et al. | Topics: Foot & Wrist Interaction; Eye Tracking & Gaze Interaction; Music Composition & Sound Design Tools | UIST
There Is More to Dwell Than Meets the Eye: Toward Better Gaze-Based Text Entry Systems With Multi-Threshold Dwell
Dwell-based text entry seems to peak at 20 words per minute (WPM). Yet, little is known about the factors contributing to this limit, except that it requires extensive training. Thus, we conducted a longitudinal study, broke the overall dwell-based selection time into six different components, and identified several design challenges and opportunities. Subsequently, we designed two novel dwell keyboards that use multiple yet much shorter dwell thresholds: Dual-Threshold Dwell (DTD) and Multi-Threshold Dwell (MTD). The performance analysis showed that MTD (18.3 WPM) outperformed both DTD (15.3 WPM) and the conventional Constant-Threshold Dwell (12.9 WPM). Notably, absolute novices achieved these speeds within just 30 phrases. Moreover, MTD’s performance is also the fastest-ever reported average text entry speed for gaze-based keyboards. Finally, we discuss how our chosen parameters can be further optimized to pave the way toward more efficient dwell-based text entry.
2025 | Aunnoy K Mutasim et al. | Simon Fraser University, School of Interactive Arts and Technology | Topics: Eye Tracking & Gaze Interaction; Motor Impairment Assistive Input Technologies | CHI
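The core mechanism named in the abstract, selection by dwell with per-key thresholds, can be sketched in a few lines. This is an illustrative reconstruction, not the paper's implementation: the class name, the re-entry reset policy, and the threshold values (0.25 s and 0.6 s) are all assumptions for the example.

```python
class MultiThresholdDwell:
    """Selects a key once gaze has rested on it longer than that key's
    dwell threshold. Thresholds are hypothetical illustration values;
    the paper's actual thresholds and assignment policy are not given
    in the abstract."""

    def __init__(self, thresholds, default=0.6):
        self.thresholds = thresholds  # dict: key -> dwell time in seconds
        self.default = default
        self.current_key = None
        self.enter_time = 0.0

    def update(self, key, now):
        """Feed the currently fixated key each frame; returns the key
        once its threshold is exceeded, else None."""
        if key != self.current_key:
            # Gaze moved to a new key: restart the dwell timer.
            self.current_key = key
            self.enter_time = now
            return None
        if key is not None and now - self.enter_time >= self.thresholds.get(key, self.default):
            self.current_key = None  # require re-entry before reselecting
            return key
        return None
```

Under such a scheme, frequent or easily reached keys could be given shorter thresholds than rarely used ones, which is one plausible reading of "multiple yet much shorter dwell thresholds."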
A Systematic Review of Fitts’ Law in 3D Extended Reality
Fitts' law is widely used as an evaluation tool for pointing or selection tasks, evolving into diverse applications, including 3D extended reality (XR) environments like virtual, augmented, and mixed reality. Despite standards like ISO 9241-411, the application of Fitts' law varies significantly across studies, complicating comparisons and undermining the reliability of findings in 3D XR research. To address this, we conducted a systematic review of 119 publications, focusing on 122 studies that used Fitts' law in 3D XR user experiments. Our analysis shows that over half of these studies referenced Fitts' law without thoroughly investigating throughput, movement time, or error rate. We performed an in-depth meta-analysis to examine how Fitts' law is incorporated into research. By highlighting trends and inconsistencies and making recommendations, this review aims to guide researchers in designing and performing more effective and consistent Fitts-based studies in 3D XR, enhancing the quality and impact of future research.
2025 | Mohammadreza Amini et al. | Concordia University, Department of Computer Science & Software Engineering | Topics: Immersion & Presence Research; Computational Methods in HCI | CHI
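For readers unfamiliar with the throughput measure the review centers on, a minimal sketch of the standard ISO 9241-411 style calculation follows. The function name and argument layout are mine; the formulas (effective width from endpoint spread, effective index of difficulty, throughput as bits per second) are the conventional ones.

```python
import math
import statistics

def effective_throughput(amplitude, movement_times, endpoints):
    """Throughput for one pointing condition, ISO 9241-411 style.

    amplitude: nominal movement distance; movement_times: seconds per
    trial; endpoints: signed deviations of each selection point from
    the target centre along the movement axis (same units as amplitude).
    """
    # Effective width from observed endpoint spread; 4.133 covers ~96%
    # of a normal distribution, matching a nominal 4% error rate.
    w_e = 4.133 * statistics.stdev(endpoints)
    # Effective amplitude: mean distance actually covered.
    a_e = amplitude + statistics.mean(endpoints)
    id_e = math.log2(a_e / w_e + 1)       # effective index of difficulty (bits)
    mt = statistics.mean(movement_times)  # mean movement time (s)
    return id_e / mt                      # throughput (bits/s)
```

The review's observation that many XR studies cite Fitts' law without reporting such a throughput, movement time, or error rate is what motivates its recommendations.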
An Artist's Perspectives on Natural Interactions for Virtual Reality 3D Sketching
Virtual Reality (VR) applications like OpenBrush offer artists access to 3D sketching tools within the digital 3D virtual space. These 3D sketching tools allow users to "paint" using virtual digital strokes that emulate real-world mark-making. Yet, users paint these strokes through (unimodal) VR controllers. Given that sketching in VR is a relatively nascent field, this paper investigates ways to expand our understanding of sketching in virtual space, taking full advantage of what an immersive digital canvas offers. Through a study conducted with artists, we identify potential methods for natural multimodal and unimodal interaction techniques in 3D sketching. These methods demonstrate ways to incrementally improve existing interaction techniques and incorporate artistic feedback into the design.
2024 | Richard Rodriguez et al. | Colorado State University | Topics: 3D Modeling & Animation; Digital Art Installations & Interactive Performance | CHI
EyeGuide & EyeConGuide: Gaze-based Visual Guides to Improve 3D Sketching Systems
Visual guides help align strokes and raise accuracy in Virtual Reality (VR) sketching tools. Automatic guides that appear at relevant sketching areas are convenient for seamless, guided sketching. We explore guides that exploit eye tracking to adapt to the user's visual attention. EyeGuide and EyeConGuide cause visual grid fragments to appear spatially close to the user's intended sketches, based on the user's eye-gaze direction and the 3D position of the hand. We evaluated the techniques in two user studies across simple and complex sketching objectives in VR. The results show that gaze-based guides have a positive effect on sketching accuracy, perceived usability, and preference over manual activation in the tested tasks. Our research contributes to the integration of gaze-contingent techniques for assistive guides and presents important insights into multimodal design applications in VR.
2024 | Rumeysa Turkmen et al. | Kadir Has University | Topics: Eye Tracking & Gaze Interaction; Mixed Reality Workspaces; 3D Modeling & Animation | CHI
Better Definition and Calculation of Throughput and Effective Parameters for Steering to Account for Subjective Speed-accuracy Tradeoffs
In Fitts' law studies to investigate pointing, throughput is used to characterize the performance of input devices and users, which is claimed to be independent of task difficulty or the user's subjective speed-accuracy bias. While throughput has been recognized as a useful metric for target-pointing tasks, the corresponding formulation for path-steering tasks and its evaluation have not been thoroughly examined in the past. In this paper, we conducted three experiments using linear, circular, and sine-wave path shapes to propose and investigate a novel formulation for the effective parameters and the throughput of steering tasks. Our results show that the effective width substantially improves the fit to data with mixed speed-accuracy biases for all task shapes. Effective width also smoothed out the throughput across all biases, while the usefulness of the effective amplitude depended on the task shape. Our study thus advances the understanding of user performance in trajectory-based tasks.
2024 | Nobuhito Kasahara et al. | Meiji University | Topics: User Research Methods (Interviews, Surveys, Observation) | CHI
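The abstract does not give the paper's equations, but the general idea of an "effective width" for steering can be illustrated by analogy with pointing: derive the width from the spread of observed lateral deviations rather than the nominal path width, then plug it into the Accot-Zhai steering-law index of difficulty. The function below is a rough sketch under that assumption; the paper's actual formulation may differ.

```python
import statistics

def steering_throughput_effective(path_length, lateral_deviations, movement_time):
    """Sketch of a steering throughput using an effective path width.

    Borrows the pointing-task convention of setting the effective width
    to 4.133 * SD of the observed deviations from the path centreline;
    this is an illustrative analogue, not the paper's exact method.
    """
    w_e = 4.133 * statistics.stdev(lateral_deviations)  # effective width
    id_e = path_length / w_e       # steering-law index of difficulty (Accot & Zhai)
    return id_e / movement_time    # throughput
```

Substituting the observed spread for the nominal width is what lets such a measure absorb a participant's subjective speed-accuracy bias: careful, slow runs produce small deviations and a narrow effective width, while fast, sloppy runs widen it.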
AdaCAD: Parametric Design as a New Form of Notation for Complex Weaving
Woven textiles are increasingly a medium through which HCI is inventing new technologies. Key challenges in integrating woven textiles in HCI include the high level of textile knowledge required to make effective use of the new possibilities they afford and the need for tools that bridge the concerns of textile designers and the concerns of HCI researchers. This paper presents AdaCAD, a parametric design tool for designing woven textile structures. Through our design and evaluation of AdaCAD we found that parametric design helps weavers notate and explain the logics behind the complex structures they generate. We discuss these findings in relation to prior work in integrating craft and/or weaving in HCI, histories of woven notation, and boundary object theory to illuminate further possibilities for collaboration between craftspeople and HCI practitioners.
2023 | Laura Devendorf et al. | University of Colorado Boulder | Topics: Textile Art & Craft Digitization | CHI
Secret Lives of Data Publics: Mixed Reality Smart City Interfaces
Conventional smart city design processes tend to focus on instrumental planning for city systems or novel services for humans. Interacting with data produced by the new services and restructured systems entailed by these processes is commonly done via interfaces like civic dashboards, leading to a critique that data-driven urbanism is bound by the rules and constraints of dashboard design [1]. Informed citizens are expected to engage with new urban information flows through the logic of dashboard interfaces. What datastreams are left off the dashboard of engaged urban experience? What design opportunities arise when dashboard visualizations are moved into the domain of mixed reality? In this two-day workshop, participants will construct prototype mixed reality interfaces for engaging the informational layer of the built urban environment. Using the Unity game engine and the Microsoft HoloLens, participants will focus on generative design in the space of data-driven interfaces, addressing issues of data access, civic agency, and privacy in the context of smart cities. Specific attention will be paid to interfaces that facilitate harmonious co-existence between humans and non-human systems (AI, IoT, etc.).
2018 | Gabriel Resch et al. | University of Toronto | Topics: Context-Aware Computing; Smart Home Interaction Design; Smart Cities & Urban Sensing | CHI