SustainaPrint: Making the Most of Eco-Friendly Filaments

We present SustainaPrint, a system for integrating eco-friendly filaments into 3D printing without compromising structural integrity. While biodegradable and recycled 3D printing filaments offer environmental benefits, they often suffer from degraded or unpredictable mechanical properties, which can limit their use in load-bearing applications. SustainaPrint addresses this by strategically assigning eco-friendly and standard filaments to different regions of a multi-material print, reinforcing the areas most likely to break with stronger material while maximizing the use of sustainable filament elsewhere. Because eco-friendly filaments often ship without technical datasheets, we also introduce a low-cost, at-home mechanical testing toolkit that lets users evaluate a filament's strength before deciding whether to use it in our pipeline. We validate SustainaPrint through real-world fabrication and mechanical testing, demonstrating its effectiveness across a range of functional 3D printing tasks.

2025 · Maxine Perroni-Scharf et al. · UIST
Topics: Desktop 3D Printing & Personal Fabrication; Customizable & Personalized Objects; Sustainable HCI

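SustainaPrint's region-wise material assignment can be sketched as a simple thresholding rule. This is an illustrative toy, not the paper's actual pipeline; the region names, stress values, strength rating, and safety factor below are all invented for the example.

```python
def assign_materials(region_stress, eco_strength, safety=2.0):
    """Assign a filament to each print region: eco-friendly filament
    wherever the predicted stress (MPa) stays under the eco filament's
    measured strength divided by a safety factor, standard filament
    elsewhere. Toy rule for illustration, not SustainaPrint's method."""
    limit = eco_strength / safety
    return {region: ("eco" if stress <= limit else "standard")
            for region, stress in region_stress.items()}

# Hypothetical per-region stresses for a wall hook (MPa):
stresses = {"hook_tip": 18.0, "base_plate": 3.5, "body": 6.0}
```

With a hypothetical eco filament rated at 20 MPa and a 2x safety factor, `assign_materials(stresses, 20.0)` reserves standard filament only for the highly stressed hook tip and prints the rest sustainably.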
FabObscura: Computational Design and Fabrication for Interactive Barrier-Grid Animations

We present FabObscura: a system for creating interactive barrier-grid animations, a classic technique that uses occlusion patterns to create the illusion of motion. Whereas traditional barrier-grid animations are constrained to simple linear occlusion patterns, FabObscura introduces a parameterization that represents patterns as mathematical functions. Our parameterization offers two key advantages over existing barrier-grid animation design methods: first, it has a high expressive ceiling by enabling the systematic design of novel patterns; second, it is versatile enough to represent all established forms of barrier-grid animations. Using this parameterization, our computational design tool enables an end-to-end workflow for authoring, visualizing, and fabricating these animations without domain expertise. Our applications demonstrate how FabObscura can be used to create animations that respond to a range of user interactions, such as translations, rotations, and changes in viewpoint. By formalizing barrier-grid animation as a computational design material, FabObscura extends its expressiveness as an interactive medium.

2025 · Ticha Sethapakdi et al. · UIST
Topics: Shape-Changing Materials & 4D Printing; Customizable & Personalized Objects; Digital Art Installations & Interactive Performance

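The pattern-as-function idea can be illustrated with a generic barrier-grid interleaver — a sketch of the classic technique, not FabObscura's actual parameterization. Pixel (x, y) of the composite is drawn from frame floor(f(x, y)) mod N; a barrier printed from the same f, with one of every N phases left transparent, then reveals one frame at a time as it slides.

```python
import math

def interleave(frames, f):
    """Compose N same-sized frames into one image: pixel (x, y) comes from
    frame floor(f(x, y)) mod N. Frames are lists of rows of pixel values."""
    n = len(frames)
    h, w = len(frames[0]), len(frames[0][0])
    return [[frames[int(math.floor(f(x, y))) % n][y][x] for x in range(w)]
            for y in range(h)]

# Classic linear barrier: vertical slits, so f depends only on x.
linear = lambda x, y: x
# An illustrative non-linear pattern: concentric rings of slits.
radial = lambda x, y: math.hypot(x - 32.0, y - 32.0)
```

Swapping `linear` for `radial` (or any other function of x and y) changes which occlusion geometry drives the animation without touching the interleaving logic.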
EI-Lite: Electrical Impedance Sensing for Micro-gesture Recognition and Pinch Force Estimation

Micro-gesture recognition and fine-grained pinch force estimation enable intuitive and discreet control of devices, offering significant potential for enhancing human-computer interaction (HCI). In this paper, we present EI-Lite, a lightweight wrist-worn electrical impedance sensing device for micro-gesture recognition and continuous pinch force estimation. We elicit an optimal, simplified device architecture through an ablation study on electrode placement with 13 users, and implement the elicited designs through 3D printing. We collect data from 15 participants on (1) six common micro-gestures (plus an idle state) and (2) index-finger pinch forces, then develop machine learning models that interpret the impedance signals generated by these micro-gestures and pinch forces. Our system recognizes micro-gesture events with 96.33% accuracy and continuously estimates the pinch force of the index finger in physical units (Newtons), with a mean squared error (MSE) of 0.3071 (a mean force variance of 0.55 N) across 15 participants. Finally, we demonstrate EI-Lite's applicability via three applications in AR/VR, gaming, and assistive technologies.

2025 · Junyi Zhu et al. · UIST
Topics: Vibrotactile Feedback & Skin Stimulation; Foot & Wrist Interaction

Meta-antenna: Mechanically Frequency Reconfigurable Metamaterial Antennas

We introduce Meta-antenna, a design and fabrication pipeline for creating frequency-reconfigurable antennas using a single type of mechanical metamaterial structure. Unlike traditional static antenna systems, with fixed radiation patterns and frequency responses per geometry, Meta-antenna leverages mechanical reconfiguration to alter the geometry and radiation characteristics of the antenna, making it more versatile for sensing and communication. Meta-antenna provides a design space of resonance frequencies from 500 MHz to 6.3 GHz (≥10 dB) under compression, bending, or rotation of the structure. Additionally, we provide an Ansys-based editor that allows users to generate metamaterial antenna geometries and simulate their resonance frequency, as well as a code template for Meta-antenna-based sensing interactions. Our technical evaluation demonstrates that our fabricated Meta-antenna structures remain functional even after 10,000 compression cycles. Finally, we contribute three example applications showcasing Meta-antenna's potential in adaptive personal devices, smart home systems, and tangible user interfaces.

2025 · Marwa AlAlawi et al. · UIST
Topics: Circuit Making & Hardware Prototyping; Customizable & Personalized Objects

InteRecon: Towards Reconstructing Interactivity of Personal Memorable Items in Mixed Reality

Digital capturing of memorable personal items is a key way to archive personal memories. Although current digitization methods (e.g., photos, videos, 3D scanning) can replicate the physical appearance of an item, they often cannot preserve its real-world interactivity. We present Interactive Digital Item (IDI), a concept of reconstructing both the physical appearance and, more importantly, the interactivity of an item. We first conducted a formative study to understand users' expectations of IDI, identifying key physical interactivity features, including geometry, interfaces, and embedded content of items. Informed by these findings, we developed InteRecon, an AR prototype enabling personal reconstruction functions for IDI creation. An exploratory study was conducted to assess the feasibility of using InteRecon and explore the potential of IDI to enrich personal memory archives. Results show that InteRecon is feasible for IDI creation, and the concept of IDI brings new opportunities for augmenting personal memory archives.

2025 · Zisu Li et al. · The Hong Kong University of Science and Technology, IIP (Computational Media and Arts); MIT CSAIL · CHI
Topics: Interactive Narrative & Immersive Storytelling

TactStyle: Generating Tactile Textures with Generative AI for Digital Fabrication

Recent work in Generative AI enables the stylization of 3D models based on image prompts. However, these methods do not incorporate tactile information, leading to designs that lack the expected tactile properties. We present TactStyle, a system that allows creators to stylize 3D models with images while incorporating the expected tactile properties. TactStyle accomplishes this using a modified image-generation model fine-tuned to generate heightfields for given surface textures. By optimizing 3D model surfaces to embody a generated texture, TactStyle creates models that match the desired style and replicate the tactile experience. We utilize a large-scale dataset of textures to train our texture generation model. In a psychophysical experiment, we evaluate the tactile qualities of a set of 3D-printed original textures and TactStyle's generated textures. Our results show that TactStyle successfully generates a wide range of tactile features from a single image input, enabling a novel approach to haptic design.

2025 · Faraz Faruqi et al. · MIT CSAIL · CHI
Topics: Force Feedback & Pseudo-Haptic Weight; Shape-Changing Interfaces & Soft Robotic Materials; Generative AI (Text, Image, Music, Video)

Xstrings: 3D Printing Cable-Driven Mechanisms for Actuation, Deformation, and Manipulation

In this paper, we present Xstrings, a method for designing and fabricating 3D printed objects with integrated cable-driven mechanisms that can be printed in one go without the need for manual assembly. Xstrings supports four types of cable-driven interactions — bend, coil, screw, and compress — which are activated by applying an input force to the cables. To facilitate the design of Xstrings objects, we present a design tool that allows users to embed cable-driven mechanisms into object geometries based on their desired interactions by automatically placing joints and cables inside the object. To assess our system, we investigate the effect of printing parameters on the strength of Xstrings objects and the extent to which the interactions are repeatable without cable breakage. We demonstrate the application potential of Xstrings through examples such as manipulable gripping, bionic robot manufacturing, and dynamic prototyping.

2025 · Jiaji Li et al. · MIT CSAIL; Zhejiang University · CHI
Topics: Shape-Changing Interfaces & Soft Robotic Materials; Desktop 3D Printing & Personal Fabrication

WasteBanned: Supporting Zero Waste Fashion Design Through Linked Edits

The commonly used cut-and-sew garment construction process, in which 2D fabric panels are cut from sheets of fabric and assembled into 3D garments, contributes to widespread textile waste in the fashion industry. There is often a significant divide between the design of the garment and the layout of the panels. One opportunity for bridging this gap is the emerging study and practice of zero waste fashion design, which involves creating clothing designs with maximum layout efficiency. Enforcing the strict constraints of zero waste sewing is challenging, as edits to one region of the garment necessarily affect neighboring panels. Based on our formative work to understand this emerging area within fashion design, we present WasteBanned, a tool that combines CAM and CAD to help users prioritize efficient material usage, work within these zero waste constraints, and edit existing zero waste garment patterns. Our user evaluation indicates that our tool helps fashion designers edit zero waste patterns to fit different bodies and add stylistic variation, while creating highly efficient fabric layouts.

2024 · Ruowang Zhang et al. · UIST
Topics: Ecological Design & Green Computing; Food Culture & Food Interaction

MouthIO: Fabricating Customizable Oral User Interfaces with Integrated Sensing and Actuation

This paper introduces MouthIO, the first customizable intraoral user interface that can be equipped with various sensors and output components. MouthIO consists of an SLA-printed brace that houses a flexible PCB within a bite-proof enclosure positioned between the molar teeth and inner cheeks. Our MouthIO design and fabrication technique enables makers to customize oral user interfaces in both form and function at low cost. All parts in contact with the oral cavity are made of bio-compatible materials to ensure safety, while the design takes into account both comfort and portability. We demonstrate MouthIO through three application examples, ranging from beverage consumption monitoring and health monitoring to assistive technology. Results from our full-day user study indicate high wearability and social acceptance levels, while our technical evaluation demonstrates the device's ability to withstand adult bite forces.

2024 · Yijing Jiang et al. · UIST
Topics: Motor Impairment Assistive Input Technologies; Telemedicine & Remote Patient Monitoring

Speed-Modulated Ironing: High-Resolution Shade and Texture Gradients in Single-Material 3D Printing

We present Speed-Modulated Ironing, a new fabrication method for programming visual and tactile properties in single-material 3D printing. We use one nozzle to 3D print and a second nozzle to reheat printed areas at varying speeds, controlling the material's temperature-response. The rapid adjustments of speed allow for fine-grained reheating, enabling high-resolution color and texture variations. We implemented our method in a tool that allows users to assign desired properties to 3D models and creates corresponding 3D printing instructions. We demonstrate our method with three temperature-responsive materials: a foaming filament, a filament with wood fibers, and a filament with cork particles. These filaments respond to temperature by changing color, roughness, transparency, and gloss. Our technical evaluation reveals the capabilities of our method in achieving sufficient resolution and color shade range that allows surface details such as small text, photos, and QR codes on 3D-printed objects. Finally, we provide application examples demonstrating the new design capabilities enabled by Speed-Modulated Ironing.

2024 · Mehmet Ozdemir et al. · UIST
Topics: Desktop 3D Printing & Personal Fabrication; Laser Cutting & Digital Fabrication

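Mapping a desired shade to a reheat speed plausibly reduces to a calibration lookup. The sketch below interpolates over measured (shade, speed) pairs; the calibration numbers are invented for illustration and are not values from the paper.

```python
def speed_for_shade(shade, calibration):
    """Linearly interpolate a reheat speed (mm/s) for a target shade in
    [0, 1] from calibration points (shade, speed) measured on test prints.
    Slower passes deposit more heat, so stronger shades map to lower speeds."""
    pts = sorted(calibration)
    if shade <= pts[0][0]:
        return pts[0][1]
    if shade >= pts[-1][0]:
        return pts[-1][1]
    for (s0, v0), (s1, v1) in zip(pts, pts[1:]):
        if s0 <= shade <= s1:
            t = (shade - s0) / (s1 - s0)
            return v0 + t * (v1 - v0)

# Hypothetical calibration for a foaming filament:
CAL = [(0.0, 60.0), (0.5, 25.0), (1.0, 5.0)]
```

A per-pixel shade image could then be converted to a per-segment speed profile for the second (reheating) nozzle's toolpath.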
BrightMarker: 3D Printed Fluorescent Markers for Object Tracking

Existing invisible object tagging methods are prone to low resolution, which impedes tracking performance. We present BrightMarker, a fabrication method that uses fluorescent filaments to embed easily trackable markers in 3D printed colored objects. By using an infrared-fluorescent filament that "shifts" the wavelength of the incident light, our optical detection setup filters out all the noise so that only the markers are present in the infrared camera image. The high contrast of the markers allows us to track them robustly regardless of the moving objects' surface color. We built a software interface for automatically embedding these markers in an input object geometry, and hardware modules that can be attached to existing mobile devices and AR/VR headsets. Our image processing pipeline robustly localizes the markers in real time from the captured images. BrightMarker can be used in a variety of applications, such as custom fabricated wearables for motion capture, tangible interfaces for AR/VR, rapid product tracking, and privacy-preserving night vision. BrightMarker exceeds the detection rate of state-of-the-art invisible marking, and even small markers (1" × 1") can be tracked at distances exceeding 2 m.

2023 · Mustafa Doga Dogan et al. · UIST
Topics: Shape-Changing Materials & 4D Printing; Circuit Making & Hardware Prototyping

Style2Fab: Functionality-Aware Segmentation for Fabricating Personalized 3D Models with Generative AI

With recent advances in Generative AI, it is becoming easier to automatically manipulate 3D models. However, current methods tend to apply edits to models globally, which risks compromising the intended functionality of the 3D model when fabricated in the physical world. For example, modifying functional segments in 3D models, such as the base of a vase, could break the original functionality of the model, thus causing the vase to fall over. We introduce a method for automatically segmenting 3D models into functional and aesthetic elements. This method allows users to selectively modify aesthetic segments of 3D models, without affecting the functional segments. To develop this method we first create a taxonomy of functionality in 3D models by qualitatively analyzing 1,000 models sourced from a popular 3D printing repository, Thingiverse. With this taxonomy, we develop a semi-automatic classification method to decompose 3D models into functional and aesthetic elements. We propose a system called Style2Fab that allows users to selectively stylize 3D models without compromising their functionality. We evaluate the effectiveness of our classification method compared to human-annotated data, and demonstrate the utility of Style2Fab with a user study to show that functionality-aware segmentation helps preserve model functionality.

2023 · Faraz Faruqi et al. · UIST
Topics: 3D Modeling & Animation; Desktop 3D Printing & Personal Fabrication; Customizable & Personalized Objects

MagKnitic: Machine-knitted Passive and Interactive Haptic Textiles with Integrated Binary Sensing

In this paper, we introduce MagKnitic, a novel approach to integrating passive force feedback and binary sensing into fabrics via digital machine knitting. Our approach utilizes digital fabrication technology to enable haptic interfaces that are soft, flexible, lightweight, and conform to the user's body shape. Despite these characteristics, our interfaces provide diverse, interactive, and responsive force feedback, expanding the design space for haptic experiences. MagKnitic provides scalable and customizable passive haptic sensations by utilizing the attractive force between ferromagnetic yarns and permanent magnets, both of which are seamlessly integrated into knitted fabrics. Moreover, we present a binary sensing capability based on the resistance drop resulting from the activated electrical path between the integrated magnets and ferromagnetic yarn upon direct contact. We offer parametric design templates for users to customize MagKnitic layouts and patterns. With various design layouts and combinations, MagKnitic supports passive haptic interactions of linear, polar, angular, planar, radial, and user-defined motions. We perform a technical evaluation of the passive force feedback and the binary sensing capabilities with different machine knitting layouts and patterns, embedded magnet sizes, and interaction distances. In addition, we conduct two user studies to validate the effectiveness of MagKnitic. Finally, we demonstrate various application scenarios, including wearable input interfaces, game controllers, passive VR/AR wearables, and interactive furniture coverings.

2023 · Yiyue Luo et al. · UIST
Topics: Haptic Wearables; Shape-Changing Interfaces & Soft Robotic Materials

Polagons: Designing and Fabricating Polarized Light Mosaics with User-Defined Color-Changing Behaviors

Polarized light mosaics (PLMs) are color-changing structures that alter their appearance based on the orientation of incident polarized light. While a few artists have developed techniques for crafting PLMs by hand, the underlying material properties are difficult to reason about; there exist no tools to bridge the high-level design objectives with the low-level physics knowledge needed to create PLMs. In this paper, we introduce the first system for creating Polagons: machine-made PLMs crafted from cellophane with user-defined color-changing behaviors. Our system includes an interface for designing and visualizing Polagons as well as a fabrication process based on laser cutting and welding that requires minimal assembly by the user. We define the design space for Polagons and demonstrate how formalizing the process for creating PLMs can enable new applications in fields such as education, data visualization, and fashion.

2023 · Ticha Sethapakdi et al. · MIT CSAIL · CHI
Topics: Laser Cutting & Digital Fabrication; Customizable & Personalized Objects

MechSense: A Design and Fabrication Pipeline for Integrating Rotary Encoders into 3D Printed Mechanisms

We introduce MechSense, 3D-printed rotary encoders that can be fabricated in one pass alongside rotational mechanisms, and report on their angular position, direction of rotation, and speed. MechSense encoders utilize capacitive sensing by integrating a floating capacitor into the rotating element and three capacitive sensor patches in the stationary part of the mechanism. Unlike existing rotary encoders, MechSense does not require manual assembly but can be seamlessly integrated during design and fabrication. Our MechSense editor allows users to integrate the encoder with a rotating mechanism and exports files for 3D printing. We contribute a sensor topology and a computational model that can compensate for print deviations. Our technical evaluation shows that MechSense can detect the angular position (mean error: 1.4°) across multiple prints and rotations, different spacing between sensor patches, and different sizes of sensors. We demonstrate MechSense through three application examples on 3D-printed tools, tangible UIs, and gearboxes.

2023 · Marwa AlAlawi et al. · MIT CSAIL · CHI
Topics: Desktop 3D Printing & Personal Fabrication; Circuit Making & Hardware Prototyping

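As a rough illustration of how three stationary patches can yield an absolute angle — a textbook three-phase decode, not MechSense's published model, which additionally compensates for print deviations — assume each patch's capacitance varies sinusoidally with rotor angle, spaced 120° apart:

```python
import math

def angle_from_patches(c1, c2, c3):
    """Recover rotor angle (degrees, in [0, 360)) from three readings
    modeled as c_k = cos(theta - k * 120 deg): project the phases onto
    two orthogonal axes (a Clarke transform) and take the arctangent."""
    alpha = c1 - 0.5 * (c2 + c3)
    beta = (math.sqrt(3) / 2.0) * (c2 - c3)
    return math.degrees(math.atan2(beta, alpha)) % 360.0
```

Tracking this angle over successive samples would also give direction of rotation and speed, the other two quantities the encoder reports.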
FlexBoard: A Flexible Breadboard for Interaction Prototyping on Curved and Deformable Surfaces

We present FlexBoard, an interaction prototyping platform that enables rapid prototyping with interactive components such as sensors, actuators, and displays on curved and deformable objects. FlexBoard offers the rapid prototyping capabilities of traditional breadboards but is also flexible enough to conform to different shapes and materials. FlexBoard's bendability is enabled by replacing the rigid body of a breadboard with a flexible living hinge that holds the metal strips from a traditional breadboard while maintaining the standard pin spacing. In addition, FlexBoards are also shape-customizable, as they can be cut to a specific length and joined together to form larger prototyping areas. We discuss FlexBoard's mechanical design and present a technical evaluation of its bendability, adhesion to curved and deformable surfaces, and holding force of electronic components. Finally, we show the usefulness of FlexBoard through three application scenarios with interactive textiles, curved tangible user interfaces, and VR.

2023 · Donghyeon Ko et al. · MIT CSAIL · CHI
Topics: Shape-Changing Interfaces & Soft Robotic Materials; Desktop 3D Printing & Personal Fabrication

InStitches: Augmenting Sewing Patterns with Personalized Material-Efficient Practice

There is a rapidly growing group of people learning to sew online. Without hands-on instruction, these learners are often left to discover the challenges and pitfalls of sewing through trial and error, which can be a frustrating and wasteful process. We present InStitches, a software tool that augments existing sewing patterns with targeted practice tasks to guide users through the skills needed to complete their chosen project. InStitches analyzes the difficulty of sewing instructions relative to a user's reported expertise in order to determine where practice will be helpful and then solves for a new pattern layout that incorporates additional practice steps while optimizing for efficient use of available materials. Our user evaluation indicates that InStitches can successfully identify challenging sewing tasks and augment existing sewing patterns with practice tasks that users find helpful, showing promise as a tool for helping those new to the craft.

2023 · Mackenzie Leake et al. · MIT CSAIL · CHI
Topics: Customizable & Personalized Objects; Makerspace Culture

Mixels: Fabricating Interfaces using Programmable Magnetic Pixels

In this paper, we present Mixels, programmable magnetic pixels that can be rapidly fabricated using an electromagnetic printhead mounted on an off-the-shelf 3-axis CNC machine. The ability to program magnetic material pixel-wise with varying magnetic force enables Mixels to create new tangible, tactile, and haptic interfaces. To facilitate the creation of interactive objects with Mixels, we provide a user interface that lets users specify the high-level magnetic behavior and that then computes the underlying magnetic pixel assignments and fabrication instructions to program the magnetic surface. Our custom hardware add-on, based on an electromagnetic printhead and a Hall effect sensor, clips onto a standard 3-axis CNC machine and can both write and read magnetic pixel values from magnetic material. Our evaluation shows that our system can reliably program and read magnetic pixels of various strengths, that we can predict the behavior of two interacting magnetic surfaces before programming them, that our electromagnet is strong enough to create pixels that utilize the maximum magnetic strength of the material being programmed, and that this material remains magnetized when removed from the magnetic plotter.

2022 · Martin Nisser et al. · UIST
Topics: EV Charging & Eco-Driving Interfaces; Shape-Changing Interfaces & Soft Robotic Materials; Circuit Making & Hardware Prototyping

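The claim about predicting how two programmed surfaces will interact can be illustrated with a first-order model — an assumption for illustration, not Mixels' actual predictor: with the grids aligned face-to-face, sum the pairwise products of facing pixel strengths.

```python
def interaction(a, b):
    """Net interaction score for two aligned magnetic pixel grids held
    face-to-face. Each entry is a signed strength (+ for north-up,
    - for south-up); by this convention, same-sign facing pixels repel.
    Positive score = net repulsion, negative = net attraction.
    First-order sketch that ignores distance falloff and fringe fields."""
    return sum(pa * pb
               for row_a, row_b in zip(a, b)
               for pa, pb in zip(row_a, row_b))
```

A surface paired with a copy of itself scores positive (repels), while pairing it with its polarity-inverted counterpart scores negative (attracts) — the kind of behavior a designer would preview before committing pixel assignments to the plotter.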
MuscleRehab: Improving Unsupervised Physical Rehabilitation by Monitoring and Visualizing Muscle Engagement

Unsupervised physical rehabilitation has traditionally used motion tracking to determine correct exercise execution. However, motion tracking is not representative of the assessment of physical therapists, who focus on muscle engagement. In this paper, we investigate whether monitoring and visualizing muscle engagement during unsupervised physical rehabilitation improves the execution accuracy of therapeutic exercises by showing users whether they target the right muscle groups. To accomplish this, we use wearable electrical impedance tomography (EIT) to monitor muscle engagement and visualize the current state on a virtual muscle-skeleton avatar, with additional optical motion tracking to monitor the user's movement. We run a user study with 10 participants that compares exercise execution while seeing muscle + motion data vs. motion data only, and also present the recorded data to a group of physical therapists for post-rehabilitation analysis. The results indicate that monitoring and visualizing muscle engagement can improve both therapeutic exercise accuracy for users during rehabilitation and post-rehabilitation evaluation by physical therapists.

2022 · Yunyi Zhu et al. · UIST
Topics: Surgical Assistance & Medical Training; Biosensors & Physiological Monitoring; Computational Methods in HCI

FabO: Integrating Fabrication with a Player's Gameplay in Existing Digital Games

Fabricating objects from a player's gameplay, for example, collectibles of valuable game items, or custom game controllers shaped from game objects, expands ways to engage with digital games. Researchers currently create such integrated fabrication games from scratch, which is time-consuming and misses the potential of integrating fabrication with the myriad existing games. Integrating fabrication with the real-time gameplay of existing games, however, is challenging without access to the source files. To address this challenge, we present a framework that uses on-screen visual content to integrate fabrication with existing digital games. To implement this framework, we built the FabO toolkit, in which (1) designers use the FabO designer interface to choose the gameplay moments for fabrication and tag the associated on-screen visual cues; (2) players then use the FabO player interface, which monitors their gameplay, identifies these cues, and auto-generates the fabrication files for the game objects. Results from our two user studies show that FabO supports integrating fabrication with diverse games while augmenting the player experience. We discuss insights from our studies on choosing suitable on-screen visual content and gameplay moments for seamless integration of fabrication.

2022 · Dishita G Turakhia et al. · C&C
Topics: Desktop 3D Printing & Personal Fabrication; Laser Cutting & Digital Fabrication
