Mold-It: Understanding How Physical Shapes Affect Interaction with Handheld Freeform Devices
Advanced technologies increasingly enable the creation of interactive devices with non-rectangular form factors, but it is currently unclear which alternative form factors are desirable for end users. We contribute an understanding of the interplay between the rationale for the form factors of such devices and their interactive content through think-aloud design sessions in which participants could mold devices as they wished using clay. We analysed their qualitative reflections on how the shapes affected interaction. Using thematic analysis, we identified shape features desirable on handheld freeform devices and discuss three themes central to the choice of form factors: freeform dexterity, shape feature discoverability, and shape adaptability (to the task and context). In a second study following the same experimental set-up, we focused on the trade-off between dexterity and discoverability and its relation to the concept of affordance. Our work reveals the shape features that most influence the choice of grasps on freeform devices, from which we derive guidelines for the design of such devices.
Marcos Serrano et al. · IRIT - Elipse · CHI 2022
Tags: Shape-Changing Interfaces & Soft Robotic Materials; Prototyping & User Testing

BezelGlide: Interacting with Graphs on Smartwatches with Minimal Screen Occlusion
We present BezelGlide, a novel suite of bezel interaction techniques designed to minimize screen occlusion and 'fat finger' effects when interacting with common graphs on smartwatches. To explore the design of BezelGlide, we conducted two user studies. First, we quantified the amount of screen occlusion experienced when interacting with the smartwatch bezel. Next, we designed two techniques that involve gliding the finger along the smartwatch bezel for graph interaction: Full BezelGlide (FBG) and Partial BezelGlide (PBG) use the full bezel or a portion of it, respectively, to reduce screen occlusion while scanning a line chart for data. In a common value-detection task, we find that PBG outperforms FBG and Shift, a touchscreen occlusion-free technique, both quantitatively and subjectively, including while mobile. We finally illustrate the generalizability of PBG to other common graph types, making it a valuable interaction technique for smartwatch users.
Ali Neshati et al. · University of Manitoba · CHI 2021
Tags: Interactive Data Visualization; Smartwatches & Fitness Bands

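The core of a bezel-gliding technique is mapping the finger's angular position on the circular bezel onto positions in the displayed chart. The abstract does not give the paper's actual mapping, so the following is a minimal sketch under assumed parameters: the function name, the arc geometry (a 270° usable arc starting at 135°), and the linear angle-to-index mapping are all illustrative, with a smaller `arc_span` standing in for the partial-bezel (PBG) variant.

```python
import math

def bezel_to_index(touch_x, touch_y, cx, cy, n_points,
                   arc_start=135.0, arc_span=270.0):
    """Map a touch on a circular bezel to a data-point index.

    The finger's angular position around the watch centre (cx, cy)
    is linearly mapped onto the chart's n_points samples. arc_start
    and arc_span (degrees) define the usable arc, so a partial-bezel
    variant simply uses a smaller span. All constants are assumed,
    not taken from the paper.
    """
    angle = math.degrees(math.atan2(touch_y - cy, touch_x - cx))
    rel = (angle - arc_start) % 360.0   # angle relative to arc start
    rel = min(rel, arc_span)            # clamp touches beyond the arc
    frac = rel / arc_span
    return round(frac * (n_points - 1))
```

With a smaller span, the same finger travel covers the same number of data points in less arc length, which is one plausible reading of why a partial-bezel design can be quicker to scan.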
Elbow-Anchored Interaction: Designing Restful Mid-Air Input
We designed a mid-air input space for restful interactions on the couch. We observed people gesturing in various postures on a couch and found that posture affects the choice of arm motions when no constraints are imposed by a system. Study participants who sat with the arm rested were more likely to use the forearm and wrist, as opposed to the whole arm. We investigate how a spherical input space, where forearm angles are mapped to screen coordinates, can facilitate restful mid-air input in multiple postures. We present two controlled studies. In the first, we examine how a spherical space compares with a planar space in an elbow-anchored setup, with a shoulder-level input space as baseline. In the second, we examine the performance of a spherical input space in four common couch postures that set unique constraints on the arm. We observe that a spherical model that captures forearm movement facilitates comfortable input across different seated postures.
Rafael Veras et al. · Huawei · CHI 2021
Tags: Mid-Air Haptics (Ultrasonic); Full-Body Interaction & Embodied Input

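The abstract's key idea is a spherical mapping: with the elbow anchored, the forearm's yaw and pitch angles are mapped to screen coordinates. A minimal sketch of such a mapping follows; the angular ranges, screen size, and function name are assumptions for illustration, not the paper's calibrated values.

```python
def spherical_to_screen(yaw, pitch, yaw_range=(-30.0, 30.0),
                        pitch_range=(-20.0, 20.0),
                        screen_w=1920, screen_h=1080):
    """Map forearm rotation angles (degrees) to screen coordinates.

    With the elbow anchored, yaw (left/right) and pitch (up/down) of
    the forearm are linearly mapped onto the screen, so a small cone
    of comfortable forearm/wrist motion covers the whole display.
    The ranges here are illustrative, not calibrated values.
    """
    def norm(v, lo, hi):
        # Clamp the angle into its range and normalise to [0, 1].
        return min(max((v - lo) / (hi - lo), 0.0), 1.0)

    x = norm(yaw, *yaw_range) * (screen_w - 1)
    y = (1.0 - norm(pitch, *pitch_range)) * (screen_h - 1)  # pitch up = screen up
    return round(x), round(y)
```

Because the mapping depends only on forearm angles, not on where the hand sits in space, it behaves the same across seated postures that move the elbow around, which is one way to read the abstract's claim of posture independence.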
Get a Grip: Evaluating Grip Gestures for VR Input Using a Lightweight Pen
The use of Virtual Reality (VR) in applications such as data analysis, artistic creation, and clinical settings requires high-precision input. However, the current design of handheld controllers, where wrist rotation is the primary input approach, does not exploit the human fingers' capability for dexterous movements for high-precision pointing and selection. To address this issue, we investigated the characteristics and potential of using a pen as a VR input device. We conducted two studies. The first examined which pen grip allowed the largest range of motion; we found a tripod grip at the rear end of the shaft met this criterion. The second study investigated target selection via 'poking' and ray-casting, where we found the pen grip outperformed the traditional wrist-based input in both cases. Finally, we demonstrate potential applications enabled by VR pen input and grip postures.
Nianlong Li et al. · Institute of Software, Chinese Academy of Sciences & University of Chinese Academy of Sciences · CHI 2020
Tags: Full-Body Interaction & Embodied Input; Social & Collaborative VR

An Analytic Model for Time-Efficient Personal Hierarchies
Hierarchy structures such as file systems are widespread interfaces for item retrieval and selection tasks. Some hierarchies can be modified by end users, such as application launchers on smartphones or pictures in a file folder. These modifiable hierarchies cannot benefit from an optimization made beforehand, as their content, unknown during the design process, is constantly evolving. We hence propose an analytic model which designers can integrate in their system to recommend a range of local structure modifications (e.g., creating new folders) to end users. Proposing a range of modifications gives end users flexibility to follow a recommendation while keeping their own meaningful grouping and labeling choices. A first experiment confirms that the recommendations built on our model can lead to modified hierarchies with faster theoretical selection times. A second experiment confirms that the theoretical selection times fit empirical selection times in different hierarchy visual layouts: linear, radial, and grid.
William Delamare et al. · Kochi University of Technology & University of Manitoba · CHI 2019
Tags: User Research Methods (Interviews, Surveys, Observation); Prototyping & User Testing

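To make the idea of "theoretical selection time" concrete, here is a deliberately simple stand-in model, not the paper's actual analytic model: each level of the hierarchy with n visible items is charged a Hick-Hyman-style decision cost, and the per-level costs are summed along the path to the target. The function name and the constants a and b are made up for illustration.

```python
import math

def selection_time(branching_factors, a=0.2, b=0.15):
    """Rough theoretical time (s) to select one item in a hierarchy.

    branching_factors lists the number of visible items at each level
    on the path to the target. Each level is charged a Hick-Hyman
    style cost a + b * log2(n + 1). This is an illustrative stand-in
    for the paper's model; a and b are arbitrary constants.
    """
    return sum(a + b * math.log2(n + 1) for n in branching_factors)
```

A model like this lets a system compare candidate structure modifications, e.g. a flat list of 64 items (`selection_time([64])`) against two nested levels of 8 (`selection_time([8, 8])`), and recommend whichever local change lowers the predicted time; the real trade-off depends on the fitted constants.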
PinchList: Leveraging Pinch Gestures for Hierarchical List Navigation on Smartphones
Intensive exploration and navigation of hierarchical lists on smartphones can be tedious and time-consuming, as it often requires users to frequently switch between multiple views. To overcome this limitation, we present PinchList, a novel interaction design that leverages pinch gestures to support seamless exploration of multi-level list items in hierarchical views. With PinchList, sub-lists are accessed with a pinch-out gesture, whereas a pinch-in gesture navigates back to the previous level. Additionally, pinch and flick gestures are used to navigate lists consisting of more than two levels. We conduct a user study to refine the design parameters of PinchList, such as a suitable item size, and quantitatively evaluate the target acquisition performance using pinch-in/out gestures in both scrolling and non-scrolling conditions. In a second study, we compare the performance of PinchList in a hierarchical navigation task against two commonly used touch interfaces for list browsing: pagination and expand-and-collapse interfaces. The results reveal that PinchList is significantly faster than the other two interfaces in accessing items located in hierarchical list views. Finally, we demonstrate that PinchList enables a host of novel applications in list-based interaction.
Teng Han et al. · University of Manitoba · CHI 2019
Tags: Hand Gesture Recognition

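The pinch-out/pinch-in distinction in PinchList comes down to whether the spread between two fingers grows or shrinks over the gesture. A minimal classifier sketch follows; the function name and the pixel threshold are assumptions for illustration, not values from the paper.

```python
import math

def classify_pinch(start_touches, end_touches, threshold=40.0):
    """Classify a two-finger gesture as 'pinch-out' (descend into a
    sub-list), 'pinch-in' (back up one level), or None (no pinch).

    Touches are ((x1, y1), (x2, y2)) pairs in pixels; threshold is
    the minimum change in finger spread, a made-up value chosen for
    illustration.
    """
    def spread(touches):
        (x1, y1), (x2, y2) = touches
        return math.hypot(x2 - x1, y2 - y1)

    delta = spread(end_touches) - spread(start_touches)
    if delta > threshold:
        return "pinch-out"
    if delta < -threshold:
        return "pinch-in"
    return None   # too small a change: likely a scroll or tap
```

The dead zone around zero keeps ordinary scrolls and taps from being misread as level changes, which matters in the scrolling conditions the study evaluates.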
Finding Information on Non-Rectangular Interfaces
With upcoming breakthroughs in free-form display technologies, new user interface design challenges have emerged. Here, we investigate a question which has been widely explored on traditional GUIs but remains unexplored on non-rectangular interfaces: what are the user strategies in terms of visual search when information is not presented in a traditional rectangular layout? To achieve this, we present two complementary studies investigating eye movements in different visual search tasks. Our results unveil which areas are seen first according to different visual structures. By doing so, we address the question of where to place relevant content for the UI designers of non-rectangular displays.
Florine Simon et al. · University of Toulouse · CHI 2019
Tags: Visualization Perception & Cognition; Prototyping & User Testing

HydroRing: Supporting Mixed Reality Haptics Using Liquid Flow
Current haptic devices are often bulky and rigid, making them unsuitable for ubiquitous interaction and scenarios where the user must also interact with the real world. To address this gap, we propose HydroRing, an unobtrusive, finger-worn device that can provide the tactile sensations of pressure, vibration, and temperature on the fingertip, enabling mixed-reality haptic interactions. Different from previous explorations, HydroRing in active mode delivers sensations using liquid travelling through a thin, flexible latex tube worn across the fingerpad, and has minimal impact on a user's dexterity and their perception of stimuli in passive mode. Two studies evaluated participants' ability to perceive and recognize sensations generated by the device, as well as their ability to perceive physical stimuli while wearing the device. We conclude by exploring several applications leveraging this mixed-reality haptics approach.
Teng Han et al. · UIST 2018
Tags: In-Vehicle Haptic, Audio & Multimodal Feedback; Haptic Wearables

D-SWIME: A Design Space for Smartwatch Interaction Techniques Supporting Mobility and Encumbrance
Smartwatches enable rapid access to information anytime and anywhere. However, current smartwatch content navigation techniques, for panning and zooming, were directly adopted from those used on smartphones. These techniques are cumbersome when performed on small smartwatch screens and have not been evaluated for their support of mobility and encumbrance contexts (when the user's hands are busy). We studied the effect of mobility and encumbrance on common content navigation techniques and found a significant decrease in performance as the pace of mobility increased or when the user was encumbered with busy hands. Based on these initial findings, we proposed a design space to improve efficiency when navigation techniques, such as panning and zooming, are employed in mobility contexts. Our results reveal that our design space can effectively be used to create novel interaction techniques that improve smartwatch content navigation in mobility and encumbrance contexts.
Gaganpreet Singh et al. · University of Manitoba · CHI 2018
Tags: Smartwatches & Fitness Bands; Context-Aware Computing

Crowdsourcing vs Laboratory-Style Social Acceptability Studies? Examining the Social Acceptability of Spatial User Interactions for Head-Worn Displays
Crowdsourcing platforms are attractive for data collection in HCI research because they provide rapid access to large and diverse participant samples. As a result, several researchers have conducted studies investigating the similarities and differences between data collected through crowdsourcing and more traditional, laboratory-style data collection. We add to this body of research by examining the feasibility of conducting social acceptability studies via crowdsourcing. Social acceptability can be a key determinant of the early adoption of emerging technologies, and as such, we focus our investigation on social acceptability for Head-Worn Display (HWD) input modalities. Our results indicate that data collected via a crowdsourced experiment and a laboratory-style setting did not differ at a statistically significant level. These results provide initial support for crowdsourcing platforms as viable options for conducting social acceptability research.
Fouad Alallah et al. · University of Manitoba · CHI 2018
Tags: User Research Methods (Interviews, Surveys, Observation); Prototyping & User Testing

PageFlip: Leveraging Page-Flipping Gestures for Efficient Command and Value Selection on Smartwatches
Selecting an item of interest on smartwatches can be tedious and time-consuming, as it involves a series of swipe and tap actions. We present PageFlip, a novel method that combines multiple touch operations, such as command invocation and value selection, into a single action for efficient interaction on smartwatches. PageFlip operates with a page-flip gesture that starts by dragging the UI from a corner of the device. We first design PageFlip by examining its key design factors, such as corners, drag directions, and drag distances. We next compare PageFlip to a functionally equivalent radial menu and a standard swipe-and-tap method. Results reveal that PageFlip improves efficiency for both discrete and continuous selection tasks. Finally, we demonstrate novel smartwatch interaction opportunities and a set of applications that can benefit from PageFlip.
Teng Han et al. · University of Manitoba · CHI 2018
Tags: Foot & Wrist Interaction; Smartwatches & Fitness Bands
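PageFlip folds command invocation and value selection into one drag: the corner where the drag starts picks the command, and the drag distance sets the value. A minimal sketch of that interpretation follows; the function name, corner hit-area size, and distance-to-value scaling are all assumed constants, not the paper's calibrated parameters.

```python
def page_flip(start, end, screen_w, screen_h,
              corner_size=60.0, value_max=100.0, full_drag=250.0):
    """Interpret a drag as a PageFlip action.

    The corner (if any) containing the drag's start point selects a
    command; the drag distance, scaled against full_drag, selects a
    value in [0, value_max]. corner_size, value_max, and full_drag
    are illustrative constants, not values from the paper.
    """
    sx, sy = start
    ex, ey = end
    corners = {
        "top-left": (0, 0), "top-right": (screen_w, 0),
        "bottom-left": (0, screen_h), "bottom-right": (screen_w, screen_h),
    }
    # Which corner, if any, did the drag start in?
    corner = next((name for name, (cx, cy) in corners.items()
                   if abs(sx - cx) <= corner_size and abs(sy - cy) <= corner_size),
                  None)
    if corner is None:
        return None  # drag did not start in a corner: not a PageFlip
    dist = ((ex - sx) ** 2 + (ey - sy) ** 2) ** 0.5
    value = min(dist / full_drag, 1.0) * value_max
    return corner, value
```

Requiring the drag to start inside a corner region is what keeps PageFlip from colliding with ordinary swipes launched from the middle of the watch face.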