"Put Your Hands Up": How Joint Attention Is Initiated Between Blind Children And Their Sighted PeersInitiating joint attention (JA) is a fundamental first step in social interactions. In sighted individuals, it relies predominantly on visual cues, such as gaze and hand gestures. These features can reduce opportunities for blind and visually impaired (BVI) and sighted people to interact. Understanding the strategies to navigate these challenges is necessary to develop technology that can facilitate more inclusive JA. To address this, we conducted a longitudinal case study of five children with mixed visual abilities engaging in activities rich with JA opportunities. In a teacher-led classroom, the children experimented with the use of an AI-powered headset designed to support BVI people in social situations. Interaction analysis established that situational complexity affects the children’s responses to initiation attempts. Furthermore, the headset adds to this complexity, affecting the frequency and reactions to attempts to initiate JA. The findings informed the creation of a JA initiation framework and suggestions for future design.2025KJKatherine Jones et al.University of BristolCognitive Impairment & Neurodiversity (Autism, ADHD, Dyslexia)Augmentative & Alternative Communication (AAC)Universal & Inclusive DesignCHI
"It Helps Us Express Our Feelings Without Having To Say Anything": Exploring Accompanying Social Play Things Designed With and For Neurodiverse Groups of ChildrenSocial play is crucial for children's well-being and development. However, many social play technologies fail to address the specific characteristics and needs of neurodiverse play and often overlook divergent play styles. To address this, we first conducted a co-design study with a neurodiverse group of 7 children (Age 7-8) and, based on insights from these sessions, then developed a prototype, ChromaConnect, that allowed children to express their play style to one another during play. To evaluate ChromaConnect's ability to support neurodiverse social play in different contexts, we observed children using it in both structured and unstructured play settings. Our findings show that ChromaConnect enabled children to create a common language of play, made divergent play modes more visible, and facilitated explicit expression of social play initiation. We discuss how these findings could be used to design `accompanying social play things' that are more inclusive of neurodiverse play characteristics and divergent play styles.2025BMBrooke Morris et al.University of Bristol, School of Computer ScienceCognitive Impairment & Neurodiversity (Autism, ADHD, Dyslexia)Universal & Inclusive DesignSpecial Education TechnologyCHI
Understanding Break-ability through Screen-based Affordances
Can J.J. Gibson's concept of affordances be empirically examined using screen-based technology? We show how screen-based affordances can be examined through the use case of perceptual toughness, i.e. the break-ability of a virtual object. We present two user experiments (n=72, n=66) examining break-ability through a novel 'Perceptual Impact Testing' methodology and an online screen-based 3D virtual environment. We show that judgements of break-ability are systematically distorted when a perceiver's virtual 'Point of Observation' or the virtual environment's 'Horizonal Geometry' is manipulated. These statistically significant results provide evidence that: 1) direct perception can account for perceptual distortions of break-ability; 2) Gibsonian affordances can be empirically examined through screen-based interactions.
2025 · Richard Grafton et al. · University of Bristol · CHI
Topics: Full-Body Interaction & Embodied Input; Visualization Perception & Cognition

"I Don't Really Get Involved In That Way": Investigating Blind and Visually Impaired Individuals’ Experiences of Joint Attention with Sighted PeopleJoint attention (JA) is a crucial component of social interaction, relying heavily on visual cues like eye gaze and pointing. This creates barriers for blind and visually impaired people (BVI) to engage in JA with sighted peers. Yet, little research has characterised these barriers or the strategies BVI people employ to overcome them. We interviewed ten BVI adults to understand JA experiences and analysed videos of four BVI children with eight sighted partners engaging in activities conducive to JA. Interviews revealed that lack of JA feedback is perceived as voids that block engagement, exacerbated in group settings, with an emphasis on oneself to fill those voids. Video analysis anchored the absence of the person element within typical JA triads, suggesting a potential for technology to foster alternative dynamics between BVI and sighted people. We argue these findings could inform technology design that supports more inclusive JA interactions.2024KJKatherine Jones et al.University of BristolVisual Impairment Technologies (Screen Readers, Tactile Graphics, Braille)Universal & Inclusive DesignCHI
Understanding Neurodiverse Social Play Between Autistic and Non-Autistic Children
Social play supports children to develop essential life skills and foster friendships. However, autistic and non-autistic children often do not have equal opportunities to engage in social play. Previous research to improve these opportunities tends to invoke social skill interventions solely for autistic children or is focused on designing for only one group, rather than considering the interactions or needs of all children in neurodiverse groups. In order to understand the different experiences of children during social play, we conducted interviews with 6 professionals who support neurodiverse social play and undertook observation sessions of 36 autistic and non-autistic children during unstructured social play. Our findings move beyond the existing characterizations of autistic social play and build upon the double empathy problem to capture and consider the needs of all children in neurodiverse playgroups. We argue these findings could be used to inform future neurodiverse social play technology design in HCI.
2024 · Brooke Morris et al. · University of Bristol · CHI
Topics: Cognitive Impairment & Neurodiversity (Autism, ADHD, Dyslexia); Universal & Inclusive Design; Collaborative Learning & Peer Teaching

How does HCI Understand Human Agency and Autonomy?
Human agency and autonomy have always been fundamental concepts in HCI. New developments, including ubiquitous AI and the growing integration of technologies into our lives, make these issues ever more pressing, as technologies increase their ability to influence our behaviours and values. However, in HCI, understandings of autonomy and agency remain ambiguous. Both concepts are used to describe a wide range of phenomena pertaining to sense-of-control, material independence, and identity. It is unclear to what degree these understandings are compatible, and how they support the development of research programs and practical interventions. We address this by reviewing 30 years of HCI research on autonomy and agency to identify current understandings, open issues, and future directions. From this analysis, we identify ethical issues, and outline key themes to guide future work. We also articulate avenues for advancing clarity and specificity around these concepts, and for coordinating integrative work across different HCI communities.
2023 · Dan Bennett et al. · University of Bristol · CHI
Topics: Privacy by Design & User Control; Technology Ethics & Critical HCI

Using Virtual Reality and Co-design to Study the Design of Large-scale Shape-Changing Interfaces
Large-scale shape-changing interfaces (SCIs) such as shape-changing walls offer opportunities for enhancing user experiences within buildings, e.g., for navigation. However, due to the embryonic nature of SCI technologies, designing and explaining the shape features that are beneficial to users is challenging. Previous work used virtual platforms (2D video or Projected Augmented Reality) to design SCIs. This paper explores how Virtual Reality (VR) can provide an immersive experience that can help in designing large-scale SCIs. We follow a co-design approach in which we use VR to obtain users' impressions of shape-changing walls. Then, we conduct co-design sessions to understand how shape-changing walls can be designed to become ambient and blend with the environment. We report our results to guide the design of shape-changing walls, as well as discuss how our approach can provide valuable insights into how a VR experience, prior to design, can help in the design process.
2023 · Luluah Albarrak et al. · University of Bristol · CHI
Topics: Shape-Changing Interfaces & Soft Robotic Materials; Prototyping & User Testing

Multifractal Mice: Inferring Task Engagement and Dimensions of Readiness-to-hand from Hand Movement
The philosophical construct readiness-to-hand describes focused, intuitive tool use, and has been linked to tool-embodiment and immersion. The construct has been influential in HCI and design for decades, but researchers currently lack appropriate measures and tools to investigate it empirically. To support such empirical work we investigate the possibility of operationalising readiness-to-hand in measurements of multifractality in movement, building on recent work in cognitive science. We conduct two experiments (N=44, N=30) investigating multifractality in mouse movements during a computer game, replicating prior results and contributing new findings. Our results show that multifractality correlates with dimensions associated with readiness-to-hand, including skill and task-engagement, during tool breakdown, task learning and normal play. We describe future possibilities for the application of these methods in HCI, supporting such work by sharing scripts and data, and introducing a new data-driven approach to parameter selection.
2022 · Dan Bennett et al. · University of Bristol · CHI
Topics: Visualization Perception & Cognition; Computational Methods in HCI

It's Touching: Understanding Touch-Affect Association in Shape-Change with Kinematic Features
With the proliferation of shape-change research in affective computing, there is a need to deepen understandings of affective responses to shape-change display. Little research has focused on affective reactions to tactile experiences in shape-change, particularly in the absence of visual information. It is also rare to study response to the shape-change as it unfolds, isolated from a final shape-change outcome. We report on two studies on touch-affect associations, using the crossmodal "Bouba-Kiki" paradigm, to understand affective responses to shape-change as it unfolds. We investigate experiences with a shape-change gadget, as it moves between rounded ("Bouba") and spiky ("Kiki") forms. We capture affective responses via the circumplex model, and use a motion analysis approach to understand the certainty of these responses. We find that touch-affect associations are influenced by both the size and the frequency of the shape-change and may be modality-dependent, and that certainty in affective associations is influenced by association-consistency.
2022 · Feng Feng et al. · University of Bristol · CHI
Topics: Shape-Changing Interfaces & Soft Robotic Materials; Visualization Perception & Cognition

Feeling Colours: Investigating Crossmodal Correspondences Between 3D Shapes, Colours and Emotions
With increasing interest in multisensory experiences in HCI there is a need to consider the potential impact of crossmodal correspondences (CCs) between sensory modalities on perception and interpretation. We investigated CCs between active haptic experiences of tangible 3D objects, visual colour and emotion using the "Bouba/Kiki" paradigm. We asked 30 participants to assign colours and emotional categories to 3D-printed objects with varying degrees of angularity and complexity. We found tendencies to associate high degrees of complexity and angularity with red colours, low brightness and high arousal levels. Less complex round shapes were associated with blue colours, high brightness and positive valence levels. These findings contrast previously reported crossmodal effects triggered by 2D shapes of similar angularity and complexity, suggesting that designers cannot simply extrapolate potential perceptual and interpretive experiences elicited by 2D shapes to seemingly similar 3D tangible objects. Instead, we propose a design space for creating tangible multisensory artefacts that can trigger specific emotional percepts and discuss implications for exploiting CCs in the design of interactive technology.
2021 · Anan Lin et al. · University of Bristol · CHI
Topics: Shape-Changing Interfaces & Soft Robotic Materials; Visualization Perception & Cognition; STEM Education & Science Communication

Exploring the Design of History-Enriched Floor Interfaces for Asynchronous Navigation Support
Environmental cues influence our spatial behaviour when we explore unfamiliar spaces. Research particularly shows that the presence/actions of other people affects our navigation decisions. Here we examine how such social information can be integrated digitally into the environment to support navigation in indoor public spaces. We carried out a study (n=12) to explore how to represent traces of navigation behaviour. We compared 6 floor visualisations and examined how they affect participants' navigational choices. Results suggest that direct representations such as footprints are most informative. To investigate further how such visualisation could work in practice, we implemented an interactive floor system and used it as a probe during one-to-one design sessions (n=26). We particularly focused on four design challenges: the overall visual representation, representation of multiple people, designing more prominent visualisations and the incorporation of non-identifying information. Our results provide insights for designers looking to develop history-enriched floor interfaces.
2020 · Luluah Albarrak et al. · DIS
Topics: Geospatial & Map Visualization; Prototyping & User Testing

Review of Quantitative Empirical Evaluations of Technology for People with Visual Impairments
Addressing the needs of visually impaired people is of continued interest in Human Computer Interaction (HCI) research. Yet, one of the major challenges facing researchers in this field continues to be how to design adequate quantitative empirical evaluation for these users in HCI. In this paper, we analyse a corpus of 178 papers on technologies designed for people with visual impairments, published since 1988, and including at least one quantitative empirical evaluation (243 evaluations in total). To inform future research in this area, we provide an overview, historic trends and a unified terminology to design and report quantitative empirical evaluations. We identify open issues and propose a set of guidelines to address them. Our analysis aims to facilitate and stimulate future research on this topic.
2020 · Emeline Brulé et al. · University of Sussex · CHI
Topics: Visual Impairment Technologies (Screen Readers, Tactile Graphics, Braille); User Research Methods (Interviews, Surveys, Observation); Prototyping & User Testing

Robots for Inclusive Play: Co-designing an Educational Game With Visually Impaired and Sighted Children
Despite being included in mainstream schools, visually impaired children still face barriers to social engagement and participation. Games could potentially help, but games that cater for both visually impaired and sighted players are scarce. We used a co-design approach to design and evaluate a robot-based educational game that could be inclusive of both visually impaired and sighted children in the context of mainstream education. We ran a focus group discussion with visual impairment educators to understand barriers to inclusive play, and then a series of co-design workshops to engage visually impaired and sighted children and educators in learning about robot technology and exploring its potential to support inclusive play experiences. We present design guidelines and an evaluation workshop of a game prototype, demonstrating group dynamics conducive to collaborative learning experiences, including shared goal setting/execution, closely coupled division of labour, and interaction symmetry.
2020 · Oussama Metatla et al. · University of Bristol · CHI
Topics: Accessible Gaming; Serious & Functional Games; Special Education Technology

Isness: Using Multi-Person VR to Design Peak Mystical Type Experiences Comparable to Psychedelics
Studies combining psychotherapy with psychedelic drugs (ΨDs) have demonstrated positive outcomes that are often associated with ΨDs' ability to induce 'mystical-type' experiences (MTEs) – i.e., subjective experiences whose characteristics include a sense of connectedness, transcendence, and ineffability. We suggest that both ΨDs and virtual reality can be situated on a broader spectrum of psychedelic technologies. To test this hypothesis, we used concepts, methods, and analysis strategies from ΨD research to design and evaluate 'Isness', a multi-person VR journey where participants experience the collective emergence, fluctuation, and dissipation of their bodies as energetic essences. A study (N=57) analyzing participant responses to a commonly used ΨD experience questionnaire (MEQ30) indicates that Isness participants reported MTEs comparable to those reported in double-blind clinical studies after high doses of psilocybin and LSD. Within a supportive setting and conceptual framework, VR phenomenology can create the conditions for MTEs from which participants derive insight and meaning.
2020 · David R. Glowacki et al. · University of Bristol & ArtSci International Foundation · CHI
Topics: Social & Collaborative VR; Immersion & Presence Research

Voice User Interfaces in Schools: Co-designing for Inclusion with Visually-Impaired and Sighted Pupils
Voice user interfaces (VUIs) are increasingly popular, particularly in homes. However, little research has investigated their potential in other settings, such as schools. We investigated how VUIs could support inclusive education, particularly for pupils with visual impairments (VIs). We organised focused discussions with educators at a school, with support staff from local authorities and, through bodystorming, with a class of 27 pupils. We then ran a series of co-design workshops with participants with mixed-visual abilities to design an educational VUI application. This provided insights into challenges faced by pupils with VIs in mainstream schools, and opened a space for educators, sighted and visually impaired pupils to reflect on and design for their shared learning experiences through VUIs. We present scenarios, a design space and an example application that show novel ways of using VUIs for inclusive education. We also reflect on co-designing with mixed-visual-ability groups in this space.
2019 · Oussama Metatla et al. · University of Bristol · CHI
Topics: Voice User Interface (VUI) Design; Voice Accessibility; Universal & Inclusive Design

"Like Popcorn": Crossmodal Correspondences Between Scents, 3D Shapes and Emotions in ChildrenThere is increasing interest in multisensory experiences in HCI. However, little research considers how sensory modalities interact with each other and how this may impact interactive experiences. We investigate how children associate emotions with scents and 3D shapes. 14 participants (10-17yrs) completed crossmodal association tasks to attribute emotional characteristics to variants of the "Bouba/Kiki" stimuli, presented as 3D tangible models, in conjunction with lemon and vanilla scents. Our findings support pre-existing mappings between shapes and scents, and confirm the associations between the combination of angular shapes ("Kiki") and lemon scent with arousing emotion, and of round shapes ("Bouba") and vanilla scent with calming emotion. This extends prior work on crossmodal correspondences in terms of stimuli (3D as opposed to 2D shapes), sample (children), and conveyed content (emotions). We outline how these findings can contribute to designing more inclusive interactive multisensory technologies.2019OMOussama Metatla et al.University of BristolHaptic WearablesVisualization Perception & CognitionCHI
Inclusive Education Technologies: Emerging Opportunities for People with Visual Impairments
Technology has become central to many activities of learning, ranging from its use in classroom education to work training, mastering a new hobby, or acquiring new skills of living. While digitally-enhanced learning tools can provide valuable access to information and personalised support, people with specific accessibility needs, such as low or no vision, can often be excluded from their use. This requires technology developers to build more inclusive designs and to offer learning experiences that can be shared by people with mixed-visual abilities. There is also scope to integrate DIY approaches and provide specialised teachers with the ability to design their own low cost educational tools, adapted to pedagogical objectives and to the variety of visual and cognitive abilities of their students. For researchers, this invites new challenges of how to best support technology adoption and its evaluation in often complex educational settings. This workshop seeks to bring together researchers and practitioners interested in accessibility and education to share best practices and lessons learnt for technology in this space; and to jointly discuss and develop future directions for the next generation design of inclusive and effective education technologies.
2018 · Oussama Metatla et al. · University of Bristol · CHI
Topics: Cognitive Impairment & Neurodiversity (Autism, ADHD, Dyslexia); Aging-Friendly Technology Design; Universal & Inclusive Design

"Bursting the Assistance Bubble": Designing Inclusive Technology with Children with Mixed Visual AbilitiesChildren living with visual impairments (VIs) are increasingly educated in mainstream rather than special schools. But knowledge about the challenges they face in inclusive schooling environments and how to design technology to overcome them remains scarce. We report findings from a field study involving interviews and observations of educators and children with/without VIs in mainstream schools, in which we identified the "teaching assistant bubble" as a potential barrier to group learning, social play and independent mobility. We present co-design activities blending elements of future workshops, multisensory crafting, fictional inquiry and bodystorming, demonstrating that children with and without VIs can jointly lead design processes and explore design spaces reflective of mixed visual abilities and shared experiences. We extend previous research by characterising challenges and opportunities for improving inclusive education of children with VIs in mainstream schools, in terms of balancing assistance and independence, and reflect on the process and outcomes of co-designing with mixed-ability groups in this context.2018OMOussama Metatla et al.University of BristolFoot & Wrist InteractionVisual Impairment Technologies (Screen Readers, Tactile Graphics, Braille)Universal & Inclusive DesignCHI
"I Hear You": Understanding Awareness Information Exchange in an Audio-only WorkspaceGraphical displays are a typical means for conveying awareness information in groupware systems to help users track joint activities, but are not ideal when vision is constrained. Understanding how people maintain awareness through non-visual means is crucial for designing effective alternatives for supporting awareness in such situations. We present a lab study simulating an extreme scenario where 32 pairs of participants use an audio-only tool to edit shared audio menus. Our aim is to characterise collaboration in this audio-only space in order to identify whether and how, by itself, audio can mediate collaboration. Our findings show that the means for audio delivery and choice of working styles in this space influence types and patterns of awareness information exchange. We thus highlight the need to accommodate different working styles when designing audio support for awareness, and extend previous research by identifying types of awareness information to convey in response to group work dynamics.2018OMOussama Metatla et al.University of BristolFull-Body Interaction & Embodied InputImmersion & Presence ResearchCHI