Ecolor: Synthesis of EInk Microcapsules for Fabricating DIY Displays
Wenda Zhao et al. (UIST 2025)
Recent HCI research has explored active materials for programmable displays, yet accessibility remains a key challenge. While some programmable materials, such as thermochromic ink, are widely available, others, like electronic ink (EInk), remain confined to industrial production. Despite its versatility, EInk is rarely used in HCI due to complex production processes that require expert knowledge in chemistry. We address this limitation by adapting existing microencapsulation techniques from other fields and identifying barriers to broader adoption. We present a simplified, safe, and more accessible method for producing EInk microcapsules. Through a series of analyses, we evaluate the viability of the resulting EInk, and in a study with six participants unfamiliar with EInk fabrication we found the process accessible and easy to follow. Our work represents a step toward democratizing EInk, bridging the gap between chemical engineering and practical application in HCI, and enabling broader integration of EInk into the design of diverse interactive devices.
Topics: Shape-Changing Interfaces & Soft Robotic Materials; On-Skin Display & On-Skin Input; Shape-Changing Materials & 4D Printing

Encounter with the Giants: Understanding Interaction with Large-scale Inflatable Soft Robots
Bijetri Biswas et al., University of Bristol, Faculty of Engineering; University of Bristol, Bristol Medical School (CHI 2025)
Soft robots, constructed from compliant materials, offer unique flexibility and adaptability. However, most research has focused on small-scale interactions, leaving the potential of large-scale soft robots largely unexplored. This research explores how humans engage with large inflatable soft robots created for fun and artistic expression. We conducted 22 hours of video analysis (N=30) and thematic interviews (N=20) to understand user engagement and explore their motivations. Our findings revealed a range of interactions, from delicate touches to immersive full-body engagement, driven by trust, safety, and emotional connection. Participants frequently compared the robots to peaceful creatures like plants and sea animals, fostering playful and therapeutic interactions. These insights highlight the potential of giant soft robots in enhancing emotional well-being, therapeutic applications, and immersive experiences. This paper aims to inspire future designs that leverage the unique attributes of large-scale soft robots for trust-centered, interactive human-robot relationships.
Topics: Shape-Changing Interfaces & Soft Robotic Materials

"You Can Fool Me, You Can't Fool Her!": Autoethnographic Insights from Equine-Assisted Interventions to Inform Therapeutic Robot Design
Ellen Weir et al., University of Bristol, Faculty of Engineering (CHI 2025)
Equine-Assisted Interventions (EAIs) aim to improve participant health and well-being through the development of a therapeutic relationship with a trained horse. These interventions leverage the horse's ability to provide emotional feedback, as it responds to negative non-verbal cues with reciprocal negativity, thereby encouraging participants to regulate their emotions and achieve attunement with the horse. Despite their benefits, EAIs face significant challenges, including logistical, financial, and resource constraints, which hinder their widespread adoption and accessibility. To address these issues, we conducted an autoethnographic study of the lead researcher's engagement in an EAI to investigate the underlying mechanisms and explore potential technological alternatives. Our findings suggest that the reciprocal and responsive non-verbal communication, combined with the horse's considerable physical presence, supports the potential of an embodied robotic system as a viable alternative. Such a system could offer a scalable and sustainable solution to the current limitations of EAIs.
Topics: Social Robot Interaction; Human-Robot Collaboration (HRC)

Understanding Break-ability through Screen-based Affordances
Richard Grafton et al., University of Bristol (CHI 2025)
Can J.J. Gibson's concept of affordances be empirically examined using screen-based technology? We show how screen-based affordances can be examined through the use case of perceptual toughness, i.e. the break-ability of a virtual object. We present two user experiments (n=72, n=66) examining break-ability through a novel 'Perceptual Impact Testing' methodology and an online screen-based 3D virtual environment. We show that judgements of break-ability are systematically distorted when a perceiver's virtual 'Point of Observation' or the virtual environment's 'Horizonal Geometry' is manipulated. These statistically significant results provide evidence that: 1) direct perception can account for perceptual distortions of break-ability; 2) Gibsonian affordances can be empirically examined through screen-based interactions.
Topics: Full-Body Interaction & Embodied Input; Visualization Perception & Cognition

SoftBioMorph: Fabricating Sustainable Shape-changing Interfaces using Soft Biopolymers
Madalina Nicolae et al. (DIS 2024)
Bio-based and biodegradable materials have shown promising results for sustainable Human-Computer Interaction (HCI) applications, including shape-changing interfaces. However, the diversity of shape-changing behaviors achievable with these materials remains unclear, as the fabrication knowledge is scattered across multiple research fields. This paper introduces SoftBioMorph, a fabrication framework that aims to integrate the fabrication know-how of sustainable soft shape-changing interfaces with biopolymers. Based on the example of Sodium Alginate, the framework contributes (1) a set of material synthesis processes that modify the biopolymer's properties to fulfill different functions; (2) a set of DIY crafting-based assembling techniques that functionalize the material and assembling properties to achieve three primitive types of change in shape; and (3) a series of application cases that demonstrate the versatility of the framework. We further discuss limitations, research questions, and fabrication challenges, presenting a comprehensive approach to sustainable prototyping in HCI.
Topics: Shape-Changing Interfaces & Soft Robotic Materials; Shape-Changing Materials & 4D Printing; Sustainable HCI

CounterSludge in Alcohol Purchasing on Online Grocery Shopping Platforms
Eszter Vigh et al. (DIS 2024)
We investigate how deceptive patterns (sludge) within online grocery shopping can influence the purchase of alcohol through design intervention, and how to counter them. Previous research investigated online shoppers' purchasing behaviors in sustainability and healthy eating. However, current research on alcohol is limited to simulated platforms that encourage the purchase of lower-alcohol beverages by altering product offerings. We conducted a heuristic evaluation of online shopping platforms highlighting the use of sludge, before developing five design intervention prototypes. We then iterated on these interventions through an online alcohol purchasing questionnaire (N=20) and two follow-up activities (N=11) (an interview with design probes; a product swap questionnaire), evaluating how the interventions could counter sludge. Our goal is to develop interventions that engage light to moderate drinkers in alcohol reduction. We found participants want a greater presence of alcohol units and product grading imagery in conjunction with neutral-toned health warnings.
Topics: Universal & Inclusive Design; Privacy by Design & User Control; Dark Patterns Recognition

DisplayFab: The State of the Art and a Roadmap in the Personal Fabrication of Free-Form Displays Using Active Materials and Additive Manufacturing
Ollie Hanton et al., University of Bath (CHI 2024)
Over recent years, there has been significant research within HCI towards free-form physical interactive devices. However, such devices are not straightforward to design, produce and deploy on demand. Traditional development revolves around iterative prototyping through component-based assembly, limiting device structure and implementation. Material-centric personal display fabrication (DisplayFab) opens the possibility of decentralised, configurable production by low-skill makers. Currently, DisplayFab is severely limited by its embryonic stage of development, the complexity of involved processes and materials, and the challenges around designing interactive structures. We present a development framework to provide a path for future research. DisplayFab has been developed by identifying 4 key breakpoints in the existing "Personal Fabrication" framework: Material and Deposition, Conception and Software, Feedback and Interactivity, and Responsible Innovation. We use these breakpoints to form a targeted literature review of relevant work. Doing this, we identify 30 challenges that act as a roadmap for future research in DisplayFab.
Topics: Shape-Changing Interfaces & Soft Robotic Materials; Desktop 3D Printing & Personal Fabrication; Laser Cutting & Digital Fabrication

Grip-Reach-Touch-Repeat: A Refined Model of Grasp to Encompass One-Handed Interaction with Arbitrary Form Factor Devices
Kaixing Zhao et al., Northwestern Polytechnical University (CHI 2024)
We extend grasp models to encompass one-handed interaction with arbitrarily shaped touchscreen devices. Current models focus on how objects are stably held by external forces. However, with touchscreen devices, we postulate that users trade off holding securely against exploring interactively. To verify this, we first conducted a qualitative study which asked participants to grasp 3D printed objects while considering different interactivity. Results of the study confirm our hypothesis and reveal clear changes in posture. To further verify this trade-off and design interactions, we developed simulation software capable of computing the stability of a grasp and its reachability. We conducted a second study based on the observed predominant grasps to validate our software with a glove. Results also confirm a consistent trade-off between stability and reachability. We conclude by discussing how this research can help in designing computational tools focused on hand-held interactions with arbitrarily shaped touchscreen devices.
Topics: Haptic Wearables; Shape-Changing Interfaces & Soft Robotic Materials; Hand Gesture Recognition

How does HCI Understand Human Agency and Autonomy?
Dan Bennett et al., University of Bristol (CHI 2023)
Human agency and autonomy have always been fundamental concepts in HCI. New developments, including ubiquitous AI and the growing integration of technologies into our lives, make these issues ever more pressing, as technologies increase their ability to influence our behaviours and values. However, understandings of autonomy and agency in HCI remain ambiguous. Both concepts are used to describe a wide range of phenomena pertaining to sense-of-control, material independence, and identity. It is unclear to what degree these understandings are compatible, and how they support the development of research programs and practical interventions. We address this by reviewing 30 years of HCI research on autonomy and agency to identify current understandings, open issues, and future directions. From this analysis, we identify ethical issues, and outline key themes to guide future work. We also articulate avenues for advancing clarity and specificity around these concepts, and for coordinating integrative work across different HCI communities.
Topics: Privacy by Design & User Control; Technology Ethics & Critical HCI

Using Virtual Reality and Co-design to Study the Design of Large-scale Shape-Changing Interfaces
Luluah Albarrak et al., University of Bristol (CHI 2023)
Large-scale shape-changing interfaces (SCIs) such as shape-changing walls offer opportunities for enhancing user experiences within buildings, e.g., for navigation. However, due to the embryonic nature of SCI technologies, designing and explaining the shape features that are beneficial to users is challenging. Previous work used virtual platforms (2D video or Projected Augmented Reality) to design SCIs. This paper explores how Virtual Reality (VR) can provide an immersive experience that can help in designing large-scale SCIs. We follow a co-design approach in which we use VR to obtain users' impressions of shape-changing walls. Then, we conduct co-design sessions to understand how shape-changing walls can be designed to become ambient and blend with the environment. We report our results to guide the design of shape-changing walls, and discuss how a VR experience, prior to design, can provide valuable insights for the design process.
Topics: Shape-Changing Interfaces & Soft Robotic Materials; Prototyping & User Testing

Multifractal Mice: Inferring Task Engagement and Dimensions of Readiness-to-hand from Hand Movement
Dan Bennett et al., University of Bristol (CHI 2022)
The philosophical construct readiness-to-hand describes focused, intuitive tool use, and has been linked to tool-embodiment and immersion. The construct has been influential in HCI and design for decades, but researchers currently lack appropriate measures and tools to investigate it empirically. To support such empirical work, we investigate the possibility of operationalising readiness-to-hand in measurements of multifractality in movement, building on recent work in cognitive science. We conduct two experiments (N=44, N=30) investigating multifractality in mouse movements during a computer game, replicating prior results and contributing new findings. Our results show that multifractality correlates with dimensions associated with readiness-to-hand, including skill and task-engagement, during tool breakdown, task learning and normal play. We describe future possibilities for the application of these methods in HCI, supporting such work by sharing scripts and data, and introducing a new data-driven approach to parameter selection.
Topics: Visualization Perception & Cognition; Computational Methods in HCI

Mold-It: Understanding how Physical Shapes affect Interaction with Handheld Freeform Devices
Marcos Serrano et al., IRIT - Elipse (CHI 2022)
Advanced technologies are increasingly enabling the creation of interactive devices with non-rectangular form factors, but it is currently unclear what alternative form factors are desirable for end users. We contribute an understanding of the interplay between the rationale for the form factors of such devices and their interactive content through think-aloud design sessions in which participants could mold devices as they wished using clay. We analysed their qualitative reflections on how the shapes affected interaction. Using thematic analysis, we identified shape features desirable on handheld freeform devices and discuss the particularity of three themes central to the choice of form factors: freeform dexterity, shape feature discoverability, and shape adaptability (to the task and context). In a second study following the same experimental set-up, we focused on the trade-off between dexterity and discoverability and the relation to the concept of affordance. Our work reveals the shape features that most affect the choice of grasps on freeform devices, from which we derive guidelines for the design of such devices.
Topics: Shape-Changing Interfaces & Soft Robotic Materials; Prototyping & User Testing

FabricatINK: Personal Fabrication of Bespoke Displays Using Electronic Ink from Upcycled E Readers
Ollie Hanton et al., University of Bristol (CHI 2022)
FabricatINK explores the personal fabrication of irregularly-shaped low-power displays using electronic ink (E ink). E ink is a programmable bicolour material used in traditional form-factors such as E readers. It has potential for more versatile use within the scope of personal fabrication of custom-shaped displays, and it has the promise to be the pre-eminent material choice for this purpose. We appraise technical literature to identify properties of E ink suited to fabrication. We identify a key roadblock, universal access to E ink as a material, and we deliver a method to circumvent this by upcycling broken electronics. We subsequently present a novel fabrication method for irregularly-shaped E ink displays. We demonstrate our fabrication process and E ink's versatility through ten prototypes showing different applications and use cases. By addressing E ink as a material for display fabrication, we uncover the potential for users to create custom-shaped truly bistable displays.
Topics: Desktop 3D Printing & Personal Fabrication; Customizable & Personalized Objects

Morphino: A Nature-Inspired Tool for the Design of Shape-Changing Interfaces
Isabel P. S. Qamar et al. (DIS 2020)
The HCI community has a strong and growing interest in shape-changing interfaces (SCIs) that can offer dynamic affordance. In this context, there is an increasing need for HCI researchers and designers to form close relationships with disciplines such as robotics and material science in order to be able to truly harness the state-of-the-art in morphing technologies. To help these synergies arise, we present Morphino: a card-based toolkit to inspire shape-changing interface designs. Our cards bring together a collection of morphing mechanisms already established in the multidisciplinary literature and illustrate them through familiar examples from nature. We begin by detailing the design of the cards, based on a review of shape-change in nature; then, we report on a series of design sessions conducted to demonstrate their usefulness in generating new ideas and in helping end-users gain a better understanding of the possibilities for shape-changing materials.
Topics: Shape-Changing Interfaces & Soft Robotic Materials

Exploring the Design of History-Enriched Floor Interfaces for Asynchronous Navigation Support
Luluah Albarrak et al. (DIS 2020)
Environmental cues influence our spatial behaviour when we explore unfamiliar spaces. Research particularly shows that the presence and actions of other people affect our navigation decisions. Here we examine how such social information can be integrated digitally into the environment to support navigation in indoor public spaces. We carried out a study (n=12) to explore how to represent traces of navigation behaviour. We compared 6 floor visualisations and examined how they affect participants' navigational choices. Results suggest that direct representations such as footprints are most informative. To investigate further how such visualisation could work in practice, we implemented an interactive floor system and used it as a probe during one-to-one design sessions (n=26). We particularly focused on four design challenges: the overall visual representation, representation of multiple people, designing more prominent visualisations, and the incorporation of non-identifying information. Our results provide insights for designers looking to develop history-enriched floor interfaces.
Topics: Geospatial & Map Visualization; Prototyping & User Testing

Sprayable User Interfaces: Prototyping Large-Scale Interactive Surfaces with Sensors and Displays
Michael Wessely et al., Massachusetts Institute of Technology (CHI 2020)
We present Sprayable User Interfaces: room-sized interactive surfaces that contain sensor and display elements created by airbrushing functional inks. Since airbrushing is inherently mobile, designers can create large-scale user interfaces on complex 3D geometries where existing stationary fabrication methods fail. To enable Sprayable User Interfaces, we developed a novel design and fabrication pipeline that takes a desired user interface layout as input and automatically generates stencils for airbrushing the layout onto a physical surface. After fabricating stencils from cardboard or projecting stencils digitally, designers spray each layer with an airbrush, attach a microcontroller to the user interface, and the interface is ready to be used. Our technical evaluation shows that Sprayable User Interfaces work on various geometries and surface materials, such as porous stone and rough wood. We demonstrate our system with several application examples including interactive smart home applications on a wall and a soft leather sofa, an interactive smart city application, and interactive architecture in public office spaces.
Topics: Shape-Changing Interfaces & Soft Robotic Materials; On-Skin Display & On-Skin Input; Smart Cities & Urban Sensing

ProtoSpray: Combining 3D Printing and Spraying to Create Interactive Displays with Arbitrary Shapes
Ollie Hanton et al., University of Bristol (CHI 2020)
ProtoSpray is a fabrication method that combines 3D printing and spray coating to create interactive displays of arbitrary shapes. Our approach makes novel use of 3D printed conductive channels to create base electrodes on 3D shapes. This is then combined with spraying active materials to produce illumination. We demonstrate the feasibility and benefits of this combined approach in 6 evaluations exploring different shaped topologies. We analyze factors such as spray orientations, surface topologies and printer resolutions, to discuss how spray nozzles can be integrated into traditional 3D printers. We present a series of ProtoSprayed objects demonstrating how our technique goes beyond existing fabrication techniques by allowing the creation of displays on objects with curvatures as complex as a Möbius strip. Our work provides a platform to empower makers to use displays as a fabrication material.
Topics: Desktop 3D Printing & Personal Fabrication; Shape-Changing Materials & 4D Printing

Skin-On Interfaces: A Bio-Driven Approach for Artificial Skin Design to Cover Interactive Devices
Marc Teyssier et al. (UIST 2019)
We propose a paradigm called Skin-On interfaces, in which interactive devices have their own (artificial) skin, thus enabling new forms of input gestures for end-users (e.g. twist, scratch). Our work explores the design space of Skin-On interfaces by following a bio-driven approach: (1) from a sensory point of view, we study how to reproduce the look and feel of the human skin through three user studies; (2) from a gestural point of view, we study what kind of gestures such interfaces enable by looking at how human beings use real skin as a communication medium or in interfaces; (3) from a technical point of view, we explore and discuss different ways of fabricating interfaces that mimic human skin sensitivity and can recognize the gestures observed in the previous study; (4) we assemble the insights of our three exploratory facets to implement a series of Skin-On interfaces and also contribute a toolkit that enables easy reproduction and fabrication.
Topics: Haptic Wearables; Shape-Changing Interfaces & Soft Robotic Materials

PickCells: A Physically Reconfigurable Cell-composed Touchscreen
Alix Goguey et al., Swansea University (CHI 2019)
Touchscreens are the predominant medium for interactions with digital services; however, their current fixed form factor narrows the scope for rich physical interactions by limiting interaction possibilities to a single, planar surface. In this paper we introduce the concept of PickCells, a fully re-configurable device concept composed of cells, that breaks the mould of rigid screens and explores a modular system that affords rich sets of tangible interactions and novel across-device relationships. Through a series of co-design activities -- involving HCI experts and potential end-users of such systems -- we synthesised a design space aimed at inspiring future research, giving researchers and designers a framework in which to explore modular screen interactions. The design space we propose unifies existing works on modular touch surfaces under a general framework and broadens horizons by opening up unexplored spaces providing new interaction possibilities. In this paper, we present the PickCells concept, a design space of modular touch surfaces, and propose a toolkit for quick scenario prototyping.
Topics: Shape-Changing Interfaces & Soft Robotic Materials; Customizable & Personalized Objects

Mantis: A Scalable, Lightweight and Accessible Architecture to Build Multiform Force Feedback Systems
Gareth Barnaby et al. (UIST 2019)
Mantis is a highly scalable system architecture that democratizes haptic devices by enabling designers to create accurate, multiform and accessible force feedback systems. Mantis uses brushless DC motors to control a laser-cut actuated arm, custom electronic controllers, and an admittance control scheme to achieve stable high-quality haptic rendering. It enables common desktop form factors but also: large workspaces (multiple arm lengths); multiple arm workspaces; and mobile workspaces. It also uses accessible components and costs significantly less than typical high-fidelity force feedback solutions, which are often confined to labs. We present our design and show that Mantis can reproduce the haptic fidelity of common robotic arms. We demonstrate its multiform ability by implementing five systems: a single desktop-sized device, a single large workspace device, a large workspace system with 4 points of feedback, a mobile system and a wearable one.
Topics: Force Feedback & Pseudo-Haptic Weight