Towards Unobtrusive Physical AI: Augmenting Everyday Objects with Intelligence and Robotic Movement for Proactive Assistance

Users constantly interact with physical, most often passive, objects. Consider if familiar objects instead proactively assisted users, e.g., a stapler moving across the table to help users organize documents, or a knife moving away to prevent injury as an inattentive user is about to lean against the countertop. In this paper, we build on the qualities of tangible interaction and focus on recognizing user needs in everyday tasks to enable ubiquitous yet unobtrusive tangible interaction. To achieve this, we introduce an architecture that leverages Large Language Models (LLMs) to perceive users’ environment and activities, perform spatial-temporal reasoning, and generate object actions aligned with inferred user intentions and object properties. We demonstrate the system’s utility in providing proactive assistance with multiple objects and in various daily scenarios. To evaluate our system components, we compare our system-generated output for user goal estimation and object action recommendation with human-annotated baselines, with results indicating good agreement.

Violet Yinuo Han et al. UIST 2025. Topics: Ubiquitous Computing; Community Engagement & Civic Technology.
Sculptable Mesh Structures for Large-Scale Form-Finding

It can be hard to design a physical structure entirely within the confines of a computer monitor. To better capture the interplay between real-world objects and a designer’s work-in-progress, practitioners will often go through a sequence of low-fidelity prototypes (paper, clay, foam) before arriving at a form that satisfies both functional and aesthetic concerns. While necessary, this model-making process can be quite time-consuming, particularly at larger scales, and the resulting geometry can be difficult to translate into a CAD environment, where it will be further refined. This paper introduces a user-adjustable, room-scale, "shape-aware" mesh structure for low-fidelity prototyping. A user physically manipulates the mesh by lengthening and shortening the edges, altering the overall curvature and sculpting coarse forms. The edges are equipped with resistive length sensors, and transmit their configuration to a central computer. The structure can later be reproduced in software, connecting this prototyping stage to the larger computational design pipeline.

Jesse T Gonzalez et al. UIST 2025. Topics: Shape-Changing Interfaces & Soft Robotic Materials; Shape-Changing Materials & 4D Printing.
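The edge-sensing step above admits a simple sketch: each resistive length sensor sits in a voltage divider, an ADC reads the divider voltage, and a calibration maps resistance back to edge length. The circuit constants and the linear calibration below are illustrative assumptions, not values from the paper.

```python
# Hypothetical sketch of reading one mesh edge: invert a voltage divider
# to recover sensor resistance, then apply an assumed linear calibration.
# All constants (fixed resistor, ADC range, calibration) are invented.

R_FIXED = 10_000.0        # ohms, fixed resistor in the divider (assumed)
V_SUPPLY = 3.3            # volts
ADC_MAX = 4095            # 12-bit ADC

R_AT_MIN = 1_000.0        # ohms at minimum edge length (assumed)
R_AT_MAX = 5_000.0        # ohms at maximum edge length (assumed)
LEN_MIN, LEN_MAX = 0.5, 1.5   # metres (assumed)

def adc_to_resistance(raw: int) -> float:
    """Invert the divider: V_out = V_supply * R_sensor / (R_sensor + R_fixed)."""
    v_out = V_SUPPLY * raw / ADC_MAX
    return R_FIXED * v_out / (V_SUPPLY - v_out)

def resistance_to_length(r: float) -> float:
    """Map resistance to edge length with the assumed linear calibration."""
    t = (r - R_AT_MIN) / (R_AT_MAX - R_AT_MIN)
    return LEN_MIN + t * (LEN_MAX - LEN_MIN)

def read_edge(raw: int) -> float:
    return resistance_to_length(adc_to_resistance(raw))
```

With these constants, a raw reading of 945 corresponds to a 3 kΩ sensor, i.e., the mid-range edge length of 1.0 m; reading every edge this way yields the configuration the central computer reconstructs.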
Personalized Bistable Orthoses for Rehabilitation of Finger Joints

Orthoses are essential components of rehabilitation yet remain limited in functionality. Static braces immobilize joints, which, especially for hand and finger injuries, interferes with users’ daily activities. Additionally, early mobilization schedules require users to take off and reapply their static orthoses frequently, which is cumbersome. To facilitate both rehabilitation and dexterity, we introduce a novel multifunctional yet unpowered finger orthosis design. Our design supports easy switching between two distinct states: a stiff state for immobilization and a flexible state for mobilization. A key benefit is that it can be customized using our computational design tool, and 3D printed in one piece. Our computational design pipeline supports tailoring the switching thresholds of the brace based on patients’ individual finger strengths and range of motion. Following a preliminary study with 10 healthy participants that validates the usability and wearability of the brace, our two-week case study with a patient indicates that our brace supports everyday activities and assists with rehabilitation.

Yuyu Lin et al. UIST 2025. Topics: Desktop 3D Printing & Personal Fabrication; Circuit Making & Hardware Prototyping; Prototyping & User Testing.
Transforming Everyday Objects into Dynamic Interfaces using Smart Flat-Foldable Structures

Dynamic physical interfaces are often dedicated devices designed to adapt their physical properties to user needs. In this paper, we present an actuation system that allows users to transform their existing objects into dynamic physical user interfaces. We design our actuation system to integrate as a self-contained locomotion layer into existing objects that are small-scale, i.e., hand-size rather than furniture-size. We envision that such objects can act as collaborators: as a studio assistant in a painter's palette, as tutors in a student's ruler, or as caretakers for plants evading direct sunlight. The key idea is to decompose the actuation into (1) energy input and (2) steering to achieve a flat form factor. The energy input is provided by simple vibration. We implement steering through differential friction controlled by flat-foldable compliant structures that can be activated electrically. We study the mechanism and its performance, and show its application scenarios enabling dynamic interactions with objects.

Violet Yinuo Han et al. UIST 2025. Topics: Shape-Changing Interfaces & Soft Robotic Materials; Hand Gesture Recognition.
A Dynamic Bayesian Network Based Framework for Multimodal Context-Aware Interactions

Multimodal context-aware interactions integrate multiple sensory inputs, such as gaze, gestures, speech, and environmental signals, to provide adaptive support across diverse user contexts. Building such systems is challenging due to the complexity of sensor fusion, real-time decision-making, and managing uncertainties from noisy inputs. To address these challenges, we propose a hybrid approach combining a dynamic Bayesian network (DBN) with a large language model (LLM). The DBN offers a probabilistic framework for modeling variables, relationships, and temporal dependencies, enabling robust, real-time inference of user intent, while the LLM incorporates world knowledge for contextual reasoning beyond explicitly modeled relationships. We demonstrate our approach with a tri-level DBN implementation for tangible interactions, integrating gaze and hand actions to infer user intent in real time. A user evaluation with 10 participants in an everyday office scenario showed that our system can accurately and efficiently infer user intentions, achieving 0.83 per-frame accuracy, even in complex environments. These results validate the effectiveness of the DBN+LLM framework for multimodal context-aware interactions.

Joel Chan et al. IUI 2025. Topics: Context-Aware Computing; Computational Methods in HCI.
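Per-frame intent inference of this kind can be approximated by a standard forward (predict-update) filter over a discrete intent variable. The intents, transition model, and gaze/hand observation models below are invented for illustration; the paper's tri-level DBN is richer than this single-variable sketch.

```python
# Hypothetical minimal sketch: a discrete Bayesian filter fusing a gaze
# channel and a hand channel to infer user intent per frame. All states
# and probabilities are illustrative assumptions, not the paper's model.

INTENTS = ["pick_up", "inspect", "ignore"]

# P(intent_t | intent_{t-1}): intents mostly persist across frames.
TRANSITION = {
    "pick_up": {"pick_up": 0.8, "inspect": 0.1, "ignore": 0.1},
    "inspect": {"pick_up": 0.2, "inspect": 0.7, "ignore": 0.1},
    "ignore":  {"pick_up": 0.1, "inspect": 0.1, "ignore": 0.8},
}

GAZE_MODEL = {"pick_up": 0.9, "inspect": 0.8, "ignore": 0.1}   # P(gaze_on | intent)
HAND_MODEL = {"pick_up": 0.9, "inspect": 0.3, "ignore": 0.05}  # P(hand_near | intent)

def step(belief, gaze_on, hand_near):
    """One predict-update cycle of the forward algorithm."""
    # Predict: propagate the belief through the transition model.
    predicted = {j: sum(belief[i] * TRANSITION[i][j] for i in INTENTS)
                 for j in INTENTS}
    # Update: weight by the likelihood of both (assumed independent) channels.
    lik = lambda p, observed: p if observed else 1.0 - p
    unnorm = {j: predicted[j] * lik(GAZE_MODEL[j], gaze_on) * lik(HAND_MODEL[j], hand_near)
              for j in INTENTS}
    z = sum(unnorm.values())
    return {j: unnorm[j] / z for j in INTENTS}

belief = {i: 1.0 / len(INTENTS) for i in INTENTS}  # uniform prior
for gaze_on, hand_near in [(True, False), (True, True), (True, True)]:
    belief = step(belief, gaze_on, hand_near)
# After repeated gaze-plus-reach evidence, "pick_up" dominates the belief.
```

In the hybrid framework described above, the output of such a filter would be the real-time intent estimate, while the LLM layer handles contextual reasoning that the explicitly modeled variables cannot capture.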
Wearable Material Properties: Passive Wearable Microstructures as Adaptable Interfaces for the Physical Environment

Users interact with static objects daily, but their preferences and needs vary. Making the objects themselves dynamic or adaptable would require updating every object. Instead, we propose a novel wearable interface that empowers users to adjust perceived material properties. To explore such wearable interfaces, we design unit cell structures that can be tiled to create surfaces with switchable properties. Each unit can be switched between two states while worn, through an integrated bistable spring and tendon-driven trigger mechanism. Our switchable properties include stiffness, height, shape, texture, and their combinations. Our wearable material interfaces are passive, 3D printed, and personalizable. We present a design tool to support users in designing their customized wearable material properties. We demonstrate several example prototypes, e.g., a sleeve allowing users to adjust how different surfaces feel, a shoe sole for users walking on different ground conditions, a prototype supporting both pillow and protective helmet properties, or a collar that can be transformed into a neck pillow with variable support.

Yuyu Lin et al., Carnegie Mellon University. CHI 2025. Topics: Haptic Wearables; Shape-Changing Interfaces & Soft Robotic Materials; Customizable & Personalized Objects.
Robotic Metamaterials: A Modular System for Hands-On Configuration of Ad-Hoc Dynamic Applications

We propose augmenting initially passive structures built from simple repeated cells with novel active units to enable dynamic, shape-changing, and robotic applications. Inspired by metamaterials that can employ mechanisms, we build a framework that allows users to configure cells of this passive structure to allow it to perform complex tasks. A key benefit is that our structures can be repeatedly (re)configured by users inserting our configuration units to turn the passive material into, e.g., locomotion robots, integrated motion platforms, or interactive interfaces, as we demonstrate in this paper. To this end, we present a mechanical system consisting of a flexible, passive, shearing lattice structure, as well as rigid and active unit cells to be inserted into the lattice for configuration. The active unit is a closed-loop pneumatically controlled shearing cell that dynamically actuates the macroscopic movement of the structure. The passive rigid cells redirect the forces to create complex motion with a reduced number of active cells. Since the placement of the rigid and active units is challenging, we offer a computational design tool. The tool optimizes the cell placement to match the macroscopic, user-defined target motions and generates the control code for the active cells.

Zhitong Cui et al., Carnegie Mellon University. CHI 2024. Topics: Shape-Changing Interfaces & Soft Robotic Materials; Shape-Changing Materials & 4D Printing.
ConeAct: A Multistable Actuator for Dynamic Materials

Complex actuators in a small form factor are essential for dynamic interfaces. In this paper, we propose ConeAct, a cone-shaped actuator that can extend, contract, and bend in multiple directions to support rich expression in dynamic materials. A key benefit of our actuator is that the whole system is self-contained and portable. We designed our actuator’s structure to be multistable to hold its shape passively, while we control its transitions between states using active materials, i.e., shape memory alloys. We present the design space by showcasing our actuator module as part of self-rolling robots, reconfigurable deployable structures, volumetric shape-changing objects, and tactile displays. To assist users in designing such structures, we present an interactive editor with integrated simulation.

Jessica Lin et al., Carnegie Mellon University. CHI 2024. Topics: Force Feedback & Pseudo-Haptic Weight; Shape-Changing Interfaces & Soft Robotic Materials.
Constraint-Driven Robotic Surfaces, at Human-Scale

Robotic surfaces, whose form and function are under computational control, offer exciting new possibilities for environments that can be customized to fit user-specific needs. When these surfaces can be reprogrammed, a once-static structure can be repurposed to serve multiple different roles over time. In this paper, we introduce such a system: an architectural-scale robotic surface, which is able to begin in a neutral state, assume a desired functional shape, and later return to its neutral (flat) position. The surface can then assume a completely different functional shape, all under program control. Though designed for large-scale applications, our surface uses small, power-efficient constraints to reconfigure itself dynamically. The driving actuation force, instead of being positioned at each "joint" of the structure, is relocated to the outer edges of the surface. Within the work presented here, we illustrate the design and implementation of such a surface, showcase a number of human-scale example functional forms that can be achieved (such as dynamic furniture), and present technical evaluations of the results.

Jesse T Gonzalez et al. UIST 2023. Topics: Shape-Changing Interfaces & Soft Robotic Materials; Shape-Changing Materials & 4D Printing.
Reprogrammable Digital Metamaterials for Interactive Devices

We present digital mechanical metamaterials that enable multiple computation loops and reprogrammable logic functions, making a significant step towards passive yet interactive devices. Our materials consist of many cells that transmit signals using an embedded bistable spring. When triggered, the bistable spring displaces and triggers the next cell. We integrate a recharging mechanism to recharge the bistable springs, enabling multiple computation rounds. Between iterations, the logic functions can be reprogrammed after fabrication. We demonstrate that such materials can trigger a simple controlled actuation anywhere in the material to change the local shape, texture, stiffness, and display. This enables large-scale interactive and functional materials with no or a small number of external actuators. We showcase the capabilities of our system with various examples: a haptic floor with tunable stiffness for different VR scenarios, a display with easy-to-reconfigure messages after fabrication, or a tactile notification integrated into users’ desktops.

Yu Jiang et al. UIST 2023. Topics: Shape-Changing Interfaces & Soft Robotic Materials; Shape-Changing Materials & 4D Printing.
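The compute, recharge, and reprogram cycle described above can be mirrored in a toy software model: a cell fires once per charge of its bistable spring, must be recharged between rounds, and can have its logic function swapped after "fabrication". The class, gate choices, and wiring below are hypothetical illustrations, not the paper's mechanism.

```python
# Hypothetical toy model of a mechanical logic cell: the bistable spring
# stores one "shot" of energy, so a cell fires once per charge; recharging
# enables another round, and reprogramming swaps the logic function.

from typing import Callable

class Cell:
    def __init__(self, fn: Callable[[bool, bool], bool]):
        self.fn = fn
        self.charged = True  # bistable spring starts in its charged state

    def trigger(self, a: bool, b: bool) -> bool:
        if not self.charged:
            raise RuntimeError("spring must be recharged between rounds")
        self.charged = False  # firing releases the stored energy
        return self.fn(a, b)

    def recharge(self) -> None:
        self.charged = True

    def reprogram(self, fn: Callable[[bool, bool], bool]) -> None:
        self.fn = fn

AND = lambda a, b: a and b
OR = lambda a, b: a or b

cell = Cell(AND)
assert cell.trigger(True, False) is False   # AND: one input low -> low
cell.recharge()                             # reset for the next round
cell.reprogram(OR)                          # change the function post-"fabrication"
assert cell.trigger(True, False) is True    # OR: one input high -> high
```

The single-shot constraint is what distinguishes this from ordinary digital logic: without the recharge step, each cell can participate in only one computation pass.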
Parametric Haptics: Versatile Geometry-based Tactile Feedback Devices

Haptic feedback is important for immersive, assistive, or multimodal interfaces, but engineering devices that generalize across applications is notoriously difficult. To address the issue of versatility, we propose Parametric Haptics, geometry-based tactile feedback devices that are customizable to render a variety of tactile sensations. To achieve this, we integrate the actuation mechanism with the tactor geometry into passive 3D printable patches, which are then connected to a generic wearable actuation interface consisting of micro gear motors. The key benefit of our approach is that the 3D-printed patches are modular, can consist of varying numbers and shapes of tactors, and that the tactors can be grouped and moved by our actuation geometry over large areas of the skin. The patches are soft, thin, conformable, and easy to customize to different use cases, thus potentially enabling a large design space of diverse tactile sensations. In our user study, we investigate the mapping between geometry parameters of our haptic patches and users’ tactile perceptions. Results indicate a good agreement between our parameters and the reported sensations, showing initial evidence that our haptic patches can produce a wide range of sensations for diverse use scenarios. We demonstrate the utility of our approach with wearable prototypes in immersive Virtual Reality (VR) scenarios, embedded into wearable objects such as glasses, and as wearable navigation and notification interfaces. We support designing such patches with a design tool in Rhino.

Violet Yinuo Han et al. UIST 2023. Topics: Haptic Wearables; Shape-Changing Interfaces & Soft Robotic Materials.
MiuraKit: A Modular Hands-On Construction Kit For Pneumatic Shape-Changing And Robotic Interfaces

Building shape-changing, robotic or deployable interfaces is notoriously difficult, often requiring fabrication skills or specialized hardware. We present a construction kit for novice users to enable immediate hands-on exploration of custom shape-changing or robotic structures through pneumatically actuated origami tubes. Our construction kit consists of generic origami actuators that can be combined with our connectors to result in a variety of shapes and motions. Our novel connectors support simple mechanical connections through snap fits and pneumatic configurations through plugs. To assist users in designing and previewing complex deformations, we provide a design tool that generates control code for user-defined designs. We envision that our construction kit will facilitate creativity support tasks (e.g., product design) or education. We demonstrate the capabilities of our construction kit with three application examples and a series of objects created during our co-creation study with novice users.

Zhitong Cui et al. DIS 2023. Topics: Shape-Changing Interfaces & Soft Robotic Materials; Desktop 3D Printing & Personal Fabrication.
Reconfigurable Elastic Metamaterials

We present a novel design for materials that are reconfigurable by end-users. Conceptually, we propose decomposing such reconfigurable materials into (1) a generic, complex material consisting of engineered microstructures (known as metamaterials) designed to be purchased and (2) a simple configuration geometry that can be fabricated by end-users to fit their individual use cases. Specifically, in this paper we investigate reconfiguring our material’s elasticity, such that it can cover existing objects and thereby augment their material properties. Users can configure their materials by generating the configuration geometry using our interactive editor, 3D printing it using commonly available filaments (e.g., PLA), and pressing it onto the generic material for local coupling. We characterize the mechanical properties of our reconfigurable elastic metamaterial and showcase the material’s applicability as, e.g., augmentation for haptic props in virtual reality, a reconfigurable shoe sole for different activities, or a battleship-like ball game.

Humphrey Yang et al. UIST 2022. Topics: Shape-Changing Interfaces & Soft Robotic Materials; Shape-Changing Materials & 4D Printing.
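The effect of local coupling on elasticity can be illustrated with a lumped spring model: pressing a rigid configuration geometry onto a cell adds stiffness in parallel, while a stack of cells loads in series. The stiffness values and the series/parallel idealization below are arbitrary assumptions for illustration, not measurements from the paper.

```python
# Hypothetical lumped-spring sketch of locally stiffening a metamaterial:
# a coupled configuration cell adds stiffness in parallel with its host
# cell, and stacked cells combine in series. Values are invented.

K_SOFT = 100.0   # N/m, stiffness of a generic (unconfigured) cell (assumed)
K_ADD = 900.0    # N/m, extra stiffness from a coupled config cell (assumed)

def cell_stiffness(configured: bool) -> float:
    """Parallel coupling: stiffnesses add when the config geometry engages."""
    return K_SOFT + (K_ADD if configured else 0.0)

def column_stiffness(config: list) -> float:
    """Cells stacked in series: compliances (1/k) add."""
    return 1.0 / sum(1.0 / cell_stiffness(c) for c in config)

soft = column_stiffness([False, False, False])    # ~33.3 N/m
mixed = column_stiffness([False, False, True])    # ~47.6 N/m
assert mixed > soft  # coupling one cell stiffens the whole column
```

Even this crude model shows the key property the abstract describes: a cheap, user-printed configuration geometry changes the perceived elasticity of the generic material without modifying the material itself.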