A Systematic Literature Review to Characterize Asymmetric Interaction in Collaborative Systems

Computer-mediated collaboration often relies on symmetrical interactions between users, where all the collaborators use identical devices. However, in some cases, either due to constraints (e.g. users in different environments) or by choice (e.g. using devices with different properties), users engage in asymmetrical interactions. Addressing such asymmetries in heterogeneous systems can be difficult, as there has been no systematic analysis of how to define them or of their impact on collaboration. In this paper, we characterize the asymmetries that can arise between users’ interactions within collaborative heterogeneous systems. To this end, we conduct a systematic literature review of asymmetric collaborative systems, coding their properties, including the interaction spaces, their input and output modalities, and shared feedback. We then define the dimensions of asymmetry that emerge from this review. We discuss their impact on collaboration and outline a set of challenges and opportunities for future research.

Victor Bréhault et al. · Université Toulouse 3, IRIT - ELIPSE · CHI 2025
Tags: Distributed Team Collaboration, Prototyping & User Testing

Grip-Reach-Touch-Repeat: A Refined Model of Grasp to Encompass One-Handed Interaction with Arbitrary Form Factor Devices

We extend grasp models to encompass one-handed interaction with arbitrarily shaped touchscreen devices. Current models focus on how objects are stably held by external forces. With touchscreen devices, however, we postulate that users trade off holding the device securely against exploring it interactively. To verify this, we first conducted a qualitative study that asked participants to grasp 3D-printed objects while considering different levels of interactivity. The results confirm our hypothesis and reveal clear changes in posture. To further examine this trade-off and design interactions, we developed simulation software capable of computing the stability of a grasp and its reachability. We then conducted a second study, based on the observed predominant grasps, to validate our software with a glove. The results again confirm a consistent trade-off between stability and reachability. We conclude by discussing how this research can help in designing computational tools for hand-held interaction with arbitrarily shaped touchscreen devices.

Kaixing Zhao et al. · Northwestern Polytechnical University · CHI 2024
Tags: Haptic Wearables, Shape-Changing Interfaces & Soft Robotic Materials, Hand Gesture Recognition

User-Driven Constraints for Layout Optimisation in Augmented Reality

Automatic layout optimisation allows users to arrange augmented reality content in the real-world environment without tedious manual interactions. This optimisation is often based on modelling the intended content placement as constraints, defined as cost functions; applying a cost-minimization algorithm then leads to a desirable placement. However, such an approach is limited by the lack of user control over the optimisation results. In this paper we explore the concept of user-driven constraints for augmented reality layout optimisation. With our approach, users can define and set up their own constraints directly within the real-world environment. We first present a design space composed of three dimensions: the constraints, the regions of interest and the constraint parameters. We then explore, through a user elicitation study, which input gestures can be employed to define the user-driven constraints of our design space. Using the results of the study, we propose a holistic system design and implementation demonstrating our user-driven constraints, which we evaluate in a final user study where participants had to create several constraints at the same time to arrange a set of virtual content.

Aziz Niyazov et al. · IRIT - University of Toulouse · CHI 2023
Tags: AR Navigation & Context Awareness, Mixed Reality Workspaces, Prototyping & User Testing

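The constraint-as-cost-function idea this abstract describes can be sketched in a few lines. The following is a minimal illustrative sketch, not the paper's system: each constraint is a hypothetical cost function over a candidate 2-D position (`near`, `avoid` are made-up names), and the layout is simply the candidate position minimising the summed cost.

```python
import math

def near(region_center, weight=1.0):
    """Cost grows with distance from a region of interest."""
    def cost(pos):
        return weight * math.dist(pos, region_center)
    return cost

def avoid(region_center, radius, weight=10.0):
    """Flat penalty for positions inside a forbidden region."""
    def cost(pos):
        return weight if math.dist(pos, region_center) < radius else 0.0
    return cost

def optimise(constraints, candidates):
    """Exhaustive search: the candidate with the lowest total cost wins."""
    return min(candidates, key=lambda p: sum(c(p) for c in constraints))

# Place content near (2, 3) but outside a 1.5-unit exclusion zone around it.
candidates = [(x, y) for x in range(11) for y in range(11)]
constraints = [near((2, 3)), avoid((2, 3), radius=1.5)]
best = optimise(constraints, candidates)
```

A real system would replace the grid search with a proper minimization algorithm and richer cost terms (occlusion, visibility, ergonomics), but the user-facing idea is the same: adding or removing a constraint just adds or removes a term from the sum.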
HoloBar: Rapid Command Execution for Head-Worn AR Exploiting Around the Field-of-View Interaction

Inefficient menu interfaces make system and application commands tedious to execute in immersive environments. HoloBar is a novel approach to ease interaction with multi-level menus in immersive environments: with HoloBar, the hierarchical menu is split between the field of view (FoV) of the head-mounted display and the smartphone (SP). Command execution is based on around-the-FoV interaction with the SP, and touch input on the SP display. HoloBar offers a unique combination of features, namely rapid mid-air activation, implicit selection of top-level items and a preview of second-level items on the SP, ensuring rapid access to commands. In a first study we validate its activation method, which consists of bringing the SP within an activation distance of the FoV. In a second study, we compare HoloBar to two alternatives, including the standard HoloLens menu. Results show that HoloBar shortens each step of a multi-level menu interaction (menu activation, top-level item selection, second-level item selection and validation), with a high success rate. A follow-up study confirms that these results hold when compared with the two validation mechanisms of HoloLens (Air-Tap and clicker).

Houssem Saidi et al. · IRIT - Elipse · CHI 2021
Tags: AR Navigation & Context Awareness, Mixed Reality Workspaces, Human-LLM Collaboration

KeyTch: Combining the Keyboard with a Touchscreen for Rapid Command Selection on Toolbars

In this paper, we address the challenge of reducing mouse pointer transitions from the working object (e.g. a text document) to simple or multi-level toolbars on desktop computers. To this end, we introduce KeyTch (pronounced ‘Keetch’), a novel approach for command selection on toolbars based on the combined use of the keyboard and a touchscreen. The toolbar is displayed on the touchscreen, which is positioned below the keyboard. Users select commands by performing gestures that combine a key press with the pinky finger and a screen touch with the thumb of the same hand. After analyzing the design properties of KeyTch, a preliminary experiment validates that users can perform such gestures and reach the entire touchscreen surface with the thumb. A first user study then shows that direct touch outperforms indirect pointing for reaching items on a simple toolbar displayed on the touchscreen. In a second study, we validate that KeyTch interaction techniques outperform the mouse for selecting items on a multi-level toolbar displayed on the touchscreen, allowing users to select up to 720 commands with an accuracy above 95%, or 480 commands with an accuracy above 97%. Finally, two follow-up studies validate the benefits of KeyTch when used in a more integrated context.

Elio Keddisseh et al. · Universite Paul Sabatier, Oktal Sydac · CHI 2021
Tags: Foot & Wrist Interaction, Knowledge Worker Tools & Workflows

Tactile Fixations: A Behavioral Marker on How People with Visual Impairments Explore Raised-line Graphics

Raised-line graphics are tactile documents made for people with visual impairments (VI). Their exploration relies on a complex two-handed behavior. To better understand the cognitive processes underlying this exploration, we proposed a new method based on “tactile fixations”. A tactile fixation occurs when a finger remains stationary within a specific spatial and temporal window. Stationary fingers are known to play an active role when exploring tactile graphics, but they had never been defined or studied before. In this study, we first defined the concept of tactile fixation, then conducted a behavioral study with ten participants with VI in order to assess the role of tactile fixations under different conditions. The results show that tactile fixations vary according to factors such as the graphic type, the involved hand, and the aim of the exploration.

Kaixing Zhao et al. · University of Toulouse · CHI 2021
Tags: Visual Impairment Technologies (Screen Readers, Tactile Graphics, Braille)

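The "stationary within a spatial and temporal window" definition above closely parallels dispersion-threshold fixation detection from eye tracking. The sketch below is an illustrative adaptation under that assumption, not the authors' actual method or thresholds: a fixation is a maximal run of finger samples whose dispersion stays under a spatial threshold for at least a minimum duration.

```python
def detect_fixations(samples, max_dispersion=5.0, min_duration=0.2):
    """samples: list of (t, x, y) finger positions, time-ordered.
    Returns (start_t, end_t, centroid) for each detected fixation."""
    fixations = []
    i, n = 0, len(samples)
    while i < n:
        xs, ys = [samples[i][1]], [samples[i][2]]
        j = i + 1
        while j < n:
            xs.append(samples[j][1])
            ys.append(samples[j][2])
            # Dispersion = x-spread + y-spread of the current window.
            if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion:
                xs.pop()
                ys.pop()
                break
            j += 1
        if samples[j - 1][0] - samples[i][0] >= min_duration:
            centroid = (sum(xs) / len(xs), sum(ys) / len(ys))
            fixations.append((samples[i][0], samples[j - 1][0], centroid))
            i = j  # skip past the fixation window
        else:
            i += 1
    return fixations
```

With thresholds tuned for touch data, running this over each finger's trace would yield the per-finger fixation events the study analyzes by condition.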
Automation: Danger or Opportunity? Designing and Assessing Automation for Interactive Systems

This course takes a practical approach to introducing the principles, methods and tools of task modeling, and shows how this technique can support the identification of automation opportunities, dangers and limitations. It includes a technical, interactive hands-on exercise on how to "do it right", addressing questions such as: How do you go from task analysis to task models? How do you identify tasks that are good candidates for automation (through analysis and simulation)? How do you identify reliability and usability dangers added by automation? How do you design usable automation at the system, application and interaction levels? And more.

Philippe Palanque et al. · ICS-IRIT, Université Paul Sabatier Toulouse 3 · CHI 2018
Tags: AI-Assisted Decision-Making & Automation, Impact of Automation on Work

Rolling-Menu: Rapid Command Selection in Toolbars Using Roll Gestures with a Multi-DoF Mouse

This paper presents Rolling-Menu, a technique for selecting toolbar items based on roll gestures with a multidimensional device, the Roly-Poly Mouse (RPM). Rolling-Menu reduces the object-command transition, resulting in a better integration of command selection with direct manipulation of application objects. Selecting a toolbar item with Rolling-Menu requires rolling RPM in a predefined direction corresponding to the item. We propose a design space for Rolling-Menu that includes different roll mappings and validation modes. A first user study, with a simple toolbar containing up to 14 items, establishes that the best version of Rolling-Menu takes, on average, up to 29% less time than the mouse to select a toolbar item. Moreover, selection accuracy with Rolling-Menu is above 90%. Both the validation mode and the mapping between roll direction and toolbar items influence the performance of Rolling-Menu. A second study compares the three best versions of Rolling-Menu with the mouse for selecting an item in two types of multidimensional toolbars: a toolbar containing dropdown lists, and a grid toolbar. Results confirm the advantage of Rolling-Menu over the mouse.

Emmanuel Dubois et al. · IRIT · CHI 2018
Tags: Full-Body Interaction & Embodied Input, Knowledge Worker Tools & Workflows, Prototyping & User Testing

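The core mechanism here, mapping a roll direction to one of N toolbar items, amounts to quantising an angle into equal sectors. The sketch below is a hypothetical illustration of one such mapping (the paper explores several), with a made-up `roll_to_item` function taking the device's 2-D roll vector.

```python
import math

def roll_to_item(roll_x, roll_y, n_items):
    """Map a 2-D roll vector to a toolbar slot index in [0, n_items)
    by quantising the roll direction into equal angular sectors."""
    angle = math.degrees(math.atan2(roll_y, roll_x)) % 360.0
    sector = 360.0 / n_items
    # Offset by half a sector so each item is centred on its direction.
    return int(((angle + sector / 2) % 360.0) // sector)
```

For a 14-item toolbar each sector spans about 25.7 degrees, which hints at why the choice of mapping and validation mode matters: narrower sectors leave less room for imprecise rolls.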