StitchFlow: Enabling In-Situ Creative Explorations of Crochet Patterns With Stitch Tracking and Process Sharing
Crochet is a tactile craft that has resisted automation, remaining a manual activity characterized by improvisation and adaptation. While crafting, practitioners enter a state of creative flow, becoming fully immersed in their work. However, tasks like documenting patterns, tracking progress, and backtracking due to mistakes or mid-process changes can disrupt their creative flow. Drawing on the concept of creative flow, a state of total engagement with clear goals, immediate feedback, effortless attention, and spontaneity, we built StitchFlow. This system enables crocheters to remain immersed in their craft while automatically constructing process documentation, and allows them to edit and share it in multiple ways. StitchFlow supports in-situ stitching without distraction or the need to remember previous steps, using a motion sensor to track hand gestures in real time and reconstruct the stitch pattern. The created designs can be viewed, edited, and combined through a graphical interface, promoting variation-making and design alterations. To foster sharing the results with others, the system supports composing and exporting the documentation using traditional methods like written patterns and crochet charts, or as flows that other users can follow within the system. Through user studies with 8 crocheters, we found that StitchFlow preserved makers' creative flow, enabled spontaneous exploration, and facilitated pattern sharing.
2025 · Zofia Marciniak et al. · Aging-Friendly Technology Design; Customizable & Personalized Objects · UIST
Juggling Extra Limbs: Identifying Control Strategies for Supernumerary Multi-Arms in Virtual Reality
Using supernumerary multi-limbs for complex tasks is a growing research focus in Virtual Reality (VR) and robotics. Understanding how users integrate extra limbs with their own to achieve shared goals is crucial for developing efficient supernumeraries. This paper presents an exploratory user study (N=14) investigating strategies for controlling virtual supernumerary limbs with varying autonomy levels in VR object manipulation tasks. Using a Wizard-of-Oz approach to simulate semi-autonomous limbs, we collected both qualitative and quantitative data. Results show participants adapted control strategies based on task complexity and system autonomy, affecting task delegation, coordination, and body ownership. Based on these findings, we propose guidelines (commands, demonstration, delegation, and labeling instructions) to improve multi-limb interaction design by adapting autonomy to user needs and fostering better context-aware experiences.
2025 · Hongyu Zhou et al. · The University of Sydney, School of Computer Science · Shape-Changing Interfaces & Soft Robotic Materials; Full-Body Interaction & Embodied Input · CHI
3D Printing Locally Activated Visual Displays Embedded in 3D Objects via Electrically Conductive and Thermochromic Materials
3D printed displays promise to create unique visual interfaces for physical objects. However, current methods for creating 3D printed displays either require specialized post-fabrication processes (e.g., electroluminescence spray and silicone casting) or function as passive elements that simply react to environmental factors (e.g., body and air temperature). These passive displays offer limited control over when, where, and how the colors change. In this paper, we introduce ThermoPixels, a method for designing and 3D printing actively controlled and visually rich thermochromic displays that can be embedded in arbitrary geometries. We investigate the color-changing and thermal properties of thermochromic and conductive filaments. Based on these insights, we designed ThermoPixels and an accompanying software tool that allows embedding ThermoPixels in arbitrary 3D geometries, creating displays of various shapes and sizes (flat, curved, or matrix displays) or displays that embed textures, multiple colors, or that are flexible.
2024 · Kongpyung (Justin) Moon et al. · KAIST · Desktop 3D Printing & Personal Fabrication; Customizable & Personalized Objects · CHI
Reconfigurable Interfaces by Shape Change and Embedded Magnets
Reconfigurable physical interfaces empower users to swiftly adapt to tailored design requirements or preferences. Shape-changing interfaces enable such reconfigurability, avoiding the cost of refabrication or part replacements. Nonetheless, reconfigurable interfaces are often bulky, expensive, or inaccessible. We propose a reversible shape-changing mechanism that enables reconfigurable 3D printed structures via translations and rotations of parts. We investigate fabrication techniques that enable reconfiguration using magnets and the thermoplasticity of heated polymer. The proposed interfaces achieve tunable haptic feedback and adjustment of different user affordances by reconfiguring input motions. The design space is demonstrated through applications in rehabilitation, embodied communication, accessibility, safety, and gaming.
2024 · Himani Deshpande et al. · Texas A&M University · Force Feedback & Pseudo-Haptic Weight; Shape-Changing Interfaces & Soft Robotic Materials; Shape-Changing Materials & 4D Printing · CHI
Big or Small, It’s All in Your Head: Visuo-Haptic Illusion of Size-Change Using Finger-Repositioning
Haptic perception of physical sizes increases the realism and immersion in Virtual Reality (VR). Prior work rendered sizes by exerting pressure on the user’s fingertips or employing tangible, shape-changing devices. These interfaces are constrained by the physical shapes they can assume, making it challenging to simulate objects growing larger or smaller than the perceived size of the interface. Motivated by literature on pseudo-haptics describing the strong influence of visuals over haptic perception, this work investigates modulating the perception of size beyond this range. We developed a fixed-size VR controller leveraging finger-repositioning to create a visuo-haptic illusion of dynamic size-change of handheld virtual objects. Through two user studies, we found that with an accompanying size-changing visual context, users can perceive virtual object sizes from 44.2% smaller to 160.4% larger than the perceived size of the device. Without the accompanying visuals, a constant size (141.4% of device size) was perceived.
2024 · Myung Jin Kim et al. · KAIST · Force Feedback & Pseudo-Haptic Weight; Shape-Changing Interfaces & Soft Robotic Materials; Full-Body Interaction & Embodied Input · CHI
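As a rough illustration of the perceivable range reported above (44.2% smaller to 160.4% larger than the device), the sketch below clamps a requested virtual object size to that range. The device size and the helper name are illustrative assumptions, not details from the paper.

```python
# Sketch: clamp a requested virtual-object size to the range the abstract
# reports as perceivable with a fixed-size device. The 6 cm device size is
# an assumed value; the gain bounds come from the abstract's thresholds.
DEVICE_SIZE_CM = 6.0                        # assumed physical controller size
MIN_GAIN, MAX_GAIN = 1 - 0.442, 1 + 0.604   # 0.558x .. 1.604x of device size

def renderable_size(target_cm: float) -> float:
    """Return the closest virtual size the illusion could plausibly render."""
    gain = target_cm / DEVICE_SIZE_CM
    clamped = min(max(gain, MIN_GAIN), MAX_GAIN)
    return DEVICE_SIZE_CM * clamped

print(renderable_size(12.0))  # a 12 cm target exceeds the range, so it clamps
```

Under this model, targets within roughly 3.3 to 9.6 cm render as requested, while anything outside snaps to the nearest boundary of the illusion's range.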
FlowAR: How Different Augmented Reality Visualizations of Online Fitness Videos Support Flow for At-Home Yoga Exercises
Online fitness video tutorials are an increasingly popular way to stay fit at home without a personal trainer. However, to keep the screen playing the video in view, users typically disrupt their balance and break the motion flow --- two main pillars for the correct execution of yoga poses. While past research partially addressed this problem, these approaches supported only a limited view of the instructor and simple movements. To enable the fluid execution of complex full-body yoga exercises, we propose FlowAR, an augmented reality system for home workouts that shows training video tutorials as always-present virtual static and dynamic overlays around the user. We tested different overlay layouts in a study with 16 participants, using motion capture equipment for baseline performance. Then, we iterated the prototype and tested it in a furnished lab simulating home settings with 12 users. Our results highlight the advantages of different visualizations and the system's general applicability.
2023 · Hye-Young Jo et al. · KAIST · AR Navigation & Context Awareness; Fitness Tracking & Physical Activity Monitoring · CHI
ShrinkCells: Localized and Sequential Shape-Changing Actuation of 3D-Printed Objects via Selective Heating
The unique behaviors of thermoplastic polymers enable shape-changing interfaces made of 3D printed objects that do not require complex electronics integration. Because existing techniques rely on external heat applied globally to a 3D printed object (e.g., hot water, heat gun, oven) to initiate the shape-changing behavior all at once, independent control of multiple parts of the object is nearly impossible. We introduce ShrinkCells, a set of shape-changing actuators that rely on localized heat to shrink or bend. This is achieved by combining the properties of two materials --- conductive PLA is used to generate localized heat that selectively triggers the shrinking of a Shape Memory Polymer. The unique benefit of ShrinkCells is their capability of triggering simultaneous or sequential shape transformations for different geometries using a single power supply. The result is 3D printed rigid structures that actuate in sequence, avoiding self-collisions when unfolding. We contribute to the body of literature on 4D fabrication with a systematic investigation of selective heating with two different materials, the design and evaluation of the ShrinkCells shape-changing primitives, and applications demonstrating the usage of these actuators.
2022 · Kongpyung Moon et al. · Shape-Changing Interfaces & Soft Robotic Materials; Shape-Changing Materials & 4D Printing · UIST
SpinOcchio: Understanding Haptic-Visual Congruency of Skin-Slip in VR with a Dynamic Grip Controller
This paper's goal is to understand the haptic-visual congruency perception of skin-slip on the fingertips given visual cues in Virtual Reality (VR). We developed SpinOcchio ('Spin' for the spinning mechanism used, 'Occhio' for the Italian word "eye"), a handheld haptic controller capable of rendering the thickness and slipping of a virtual object pinched between two fingers. This is achieved using a mechanism with spinning and pivoting disks that apply a tangential skin-slip movement to the fingertips. With SpinOcchio, we determined the baseline haptic discrimination threshold for skin-slip, and, using these results, we tested how haptic realism of motion and thickness is perceived with varying visual cues in VR. Surprisingly, the results show that in all cases, visual cues dominate over haptic perception. Based on these results, we suggest applications that leverage skin-slip and grip interaction, contributing further to realistic experiences in VR.
2022 · Myung Jin Kim et al. · KAIST · Haptic Wearables; Brain-Computer Interface (BCI) & Neurofeedback · CHI
MyDJ: Sensing Food Intakes with an Attachable on Your Eyeglass Frame
Various automated eating detection wearables have been proposed to monitor food intakes. While these systems overcome the forgetfulness of manual user journaling, they typically show low accuracy in outside-the-lab environments or have intrusive form factors (e.g., headgear). Eyeglasses are emerging as a socially acceptable eating detection wearable, but existing approaches require custom-built frames and consume substantial power. We propose MyDJ, an eating detection system that can be attached to any eyeglass frame. MyDJ achieves accurate and energy-efficient eating detection by capturing complementary chewing signals on a piezoelectric sensor and an accelerometer. We evaluated the accuracy and wearability of MyDJ with 30 subjects in uncontrolled environments, where six subjects attached MyDJ to their own eyeglasses for a week. Our study shows that MyDJ achieves a 0.919 F1-score in eating episode coverage, with 4.03× the battery time of state-of-the-art systems. In addition, participants reported that wearing MyDJ was almost as comfortable (94.95%) as wearing regular eyeglasses.
2022 · Jaemin Shin et al. · KAIST · Smartwatches & Fitness Bands; Biosensors & Physiological Monitoring · CHI
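The abstract above cites a 0.919 F1-score for eating-episode coverage. As a reminder of what that metric measures, here is a minimal sketch of F1 computed from detection counts; the counts below are invented for illustration and are not from the paper.

```python
# Sketch: F1-score as the harmonic mean of precision and recall over
# detected eating episodes. tp/fp/fn counts here are made up.
def f1_score(tp: int, fp: int, fn: int) -> float:
    precision = tp / (tp + fp)   # fraction of detections that were real episodes
    recall = tp / (tp + fn)      # fraction of real episodes that were detected
    return 2 * precision * recall / (precision + recall)

# e.g., 91 episodes detected correctly, 9 false alarms, 7 missed episodes
print(round(f1_score(tp=91, fp=9, fn=7), 3))
```

The harmonic mean penalizes an imbalance between false alarms and misses, which is why episode-level F1 is a stricter summary than raw accuracy for sparse events like meals.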
Elevate: A Walkable Pin-Array for Large Shape-Changing Terrains
Current head-mounted displays enable users to explore virtual worlds by simply walking through them (i.e., real-walking VR). This led researchers to create haptic displays that can also simulate different types of elevation shapes. However, existing shape-changing floors are limited by their tabletop scale or the coarse resolution of the terrains they can display due to the limited number of actuators and low vertical resolution. To tackle this challenge, we introduce Elevate, a dynamic and walkable pin-array floor on which users can experience not only large variations in shapes but also the details of the underlying terrain. Our system achieves this by packing 1200 pins arranged on a 1.80 × 0.60 m platform, in which each pin can be actuated to one of ten height levels (resolution: 15 mm/level). To demonstrate its applicability, we present our haptic floor combined with four walkable applications and a user study that reported increased realism and enjoyment.
2021 · Seungwoo Je et al. · KAIST · Shape-Changing Interfaces & Soft Robotic Materials; Full-Body Interaction & Embodied Input · CHI
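The pin-array figures in this abstract imply simple grid arithmetic: 1200 pins on a 1.80 × 0.60 m platform are consistent with a 60 × 20 layout at 30 mm pitch, and ten 15 mm levels quantize terrain heights into a 0–135 mm span. A minimal sketch, assuming that 60 × 20 layout (the exact arrangement is not stated in the abstract):

```python
# Sketch: grid arithmetic implied by the Elevate abstract. The 60 x 20
# layout is an assumption that matches 1200 pins on 1.80 x 0.60 m.
COLS, ROWS = 60, 20            # 60 * 20 = 1200 pins
PITCH_M = 1.80 / COLS          # = 0.60 / ROWS = 0.03 m between pin centers
LEVELS, STEP_MM = 10, 15       # ten discrete levels, 15 mm per level

def quantize_height(terrain_mm: float) -> int:
    """Map a desired terrain height to the nearest displayable pin level (0-9)."""
    level = round(terrain_mm / STEP_MM)
    return min(max(level, 0), LEVELS - 1)

print(COLS * ROWS, PITCH_M, quantize_height(100.0))
```

Quantizing to the nearest level keeps the worst-case height error at half a step (7.5 mm), which bounds how faithfully any terrain can be reproduced on the floor.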
GamesBond: Bimanual Haptic Illusion of Physically Connected Objects for Immersive VR Using Grip Deformation
Virtual Reality experiences, such as games and simulations, typically support the usage of bimanual controllers to interact with virtual objects. To recreate the haptic sensation of holding objects of various shapes and behaviors with both hands, previous researchers have used mechanical linkages between the controllers that render adjustable stiffness. However, the linkage cannot quickly adapt to simulate dynamic objects, nor can it be removed to support free movements. This paper introduces GamesBond, a pair of 4-DoF controllers without a physical linkage but capable of creating the illusion of being connected as a single device, forming a virtual bond. The two controllers work together by dynamically displaying and physically rendering deformations of hand grips, allowing users to perceive a single connected object between the hands, such as a jumping rope. With a user study and various applications we show that GamesBond increases the realism, immersion, and enjoyment of bimanual interaction.
2021 · Neung Ryu et al. · KAIST · In-Vehicle Haptic, Audio & Multimodal Feedback; Shape-Changing Interfaces & Soft Robotic Materials; Full-Body Interaction & Embodied Input · CHI
ToonNote: Improving Communication in Computational Notebooks Using Interactive Data Comics
Computational notebooks help data analysts analyze and visualize datasets, and share analysis procedures and outputs. However, notebooks typically combine code (e.g., Python scripts), notes, and outputs (e.g., tables, graphs). The combination of disparate materials is known to hinder the comprehension of notebooks, making it difficult for analysts to collaborate with other analysts unfamiliar with the dataset. To mitigate this problem, we introduce ToonNote, a JupyterLab extension that enables the conversion of notebooks into "data comics." ToonNote provides a simplified view of a Jupyter notebook, highlighting the most important results while supporting interactive and free exploration of the dataset. This paper presents the results of a formative study that motivated the system, its implementation, and an evaluation with 12 users, demonstrating the effectiveness of the produced comics. We discuss how our findings inform the future design of interfaces for computational notebooks and features to support diverse collaborators.
2021 · DaYe Kang et al. · KAIST · Interactive Data Visualization; Data Storytelling · CHI
SchemaBoard: Supporting Correct Assembly of Schematic Circuits using Dynamic In-Situ Visualization
Assembling circuits on breadboards using reference designs is a common activity among makers. While tools like Fritzing offer a simplified visualization of how components and wires are connected, such pictorial depictions of circuits are rare in formal educational materials and the vast bulk of online technical documentation. Electronic schematics are more common but are perceived as challenging and confusing by novice makers. To improve access to schematics, we propose SchemaBoard, a system for assisting makers in assembling and inspecting circuits on breadboards from schematic source materials. SchemaBoard uses an LED matrix integrated underneath a working breadboard to visualize via light patterns where and how components should be placed, or to highlight elements of circuit topology such as electrical nets and connected pins. This paper presents a formative study with 16 makers, the SchemaBoard system, and a summative evaluation with an additional 16 users. Results indicate that SchemaBoard is effective in reducing both the time and the number of errors associated with building a circuit from a reference schematic, and in inspecting the circuit for correctness after its assembly.
2020 · Yoonji Kim et al. · Circuit Making & Hardware Prototyping; User Research Methods (Interviews, Surveys, Observation) · UIST
ElaStick: A Handheld Variable Stiffness Display for Rendering Dynamic Haptic Response of Flexible Objects
Haptic controllers have an important role in providing rich and immersive Virtual Reality (VR) experiences. While previous works have succeeded in creating handheld devices that simulate dynamic properties of rigid objects, such as weight, shape, and movement, recreating the behavior of flexible objects with different stiffness using ungrounded controllers remains an open challenge. In this paper we present ElaStick, a variable-stiffness controller that simulates the dynamic response resulting from shaking or swinging flexible virtual objects. This is achieved by dynamically changing the stiffness of four custom elastic tendons along a joint that effectively increase and reduce the overall stiffness of a perceived object in 2-DoF. We show that with the proposed mechanism, we can render stiffness with high precision and granularity in a continuous range between 10.8 and 71.5 N·mm/°. We estimate the threshold of the human perception of stiffness with a just-noticeable difference (JND) study and investigate the levels of immersion, realism, and enjoyment using a VR application.
2020 · Neung Ryu et al. · Force Feedback & Pseudo-Haptic Weight; Full-Body Interaction & Embodied Input; Immersion & Presence Research · UIST
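The JND study above estimates how finely users can discriminate stiffness within the 10.8 to 71.5 N·mm/° rendering range. A common way to read such a result is as a Weber fraction: each distinguishable stiffness step is a fixed percentage above the last. The sketch below uses a 20% fraction purely as an illustrative assumption, not the paper's measured threshold.

```python
# Sketch: counting distinguishable stiffness levels under a Weber-fraction
# model. The range is from the abstract; the 20% fraction is assumed.
K_MIN, K_MAX = 10.8, 71.5   # N*mm/deg, continuous stiffness rendering range
WEBER = 0.20                # assumed just-noticeable relative change

def distinguishable_levels(k_min: float, k_max: float, weber: float) -> int:
    """Count stiffness steps a user could tell apart under this model."""
    levels, k = 1, k_min
    while k * (1 + weber) <= k_max:   # next perceptibly-different stiffness
        k *= 1 + weber
        levels += 1
    return levels

print(distinguishable_levels(K_MIN, K_MAX, WEBER))
```

Under this assumed model the continuous range collapses into roughly a dozen perceptually distinct stiffness settings, which is why a JND estimate matters for deciding how finely such a device needs to be controlled.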
BodyPrinter: Fabricating Circuits Directly on the Skin at Arbitrary Locations Using a Wearable Compact Plotter
On-body electronics and sensors offer the opportunity to seamlessly augment the human with computing power. Accordingly, numerous previous works investigated methods that exploit conductive materials and flexible substrates to fabricate circuits in the form of wearable devices, stretchable patches, and stickers that can be attached to the skin. For all these methods, the fabrication process involves several manual steps, such as designing the circuit in software, constructing conductive patches, and manually placing these physical patches on the body. In contrast, in this work, we propose to fabricate electronics directly on the skin. We present BodyPrinter, a wearable conductive-ink deposition machine that prints flexible electronics directly on the body using skin-safe conductive ink. The paper describes our system in detail and, through a series of examples and a technical evaluation, we show how direct on-body fabrication of electronic circuits and sensors can further enhance the human body.
2020 · Youngkyung Choi et al. · Electrical Muscle Stimulation (EMS); Haptic Wearables; On-Skin Display & On-Skin Input · UIST
VirtualComponent: A Mixed-Reality Tool for Designing and Tuning Breadboarded Circuits
Prototyping electronic circuits is an increasingly popular activity, supported by researchers who develop toolkits to improve the design, debugging, and fabrication of electronics. Although past work mainly dealt with circuit topology, in this paper we propose a system for determining or tuning the values of the circuit components. Based on the results of a formative study with seventeen makers, we designed VirtualComponent, a mixed-reality tool that allows users to digitally place electronic components on a real breadboard, tune their values in software, and see these changes applied to the physical circuit in real time. VirtualComponent is composed of a set of plug-and-play modules containing banks of components, and a custom breadboard managing the connections and components' values. Through demonstrations and the results of an informal study with twelve makers, we show that VirtualComponent is easy to use and allows users to test component value configurations with little effort.
2019 · Yoonji Kim et al. · Korea Advanced Institute of Science and Technology · Desktop 3D Printing & Personal Fabrication; Circuit Making & Hardware Prototyping · CHI
Aero-plane: A Handheld Force-Feedback Device that Renders Weight Motion Illusion on a Virtual 2D Plane
Force feedback is said to be the next frontier in virtual reality (VR). Recently, with consumers pushing forward with untethered VR, researchers turned away from solutions based on bulky hardware (e.g., exoskeletons and robotic arms) and started exploring smaller portable or wearable devices. However, when it comes to rendering inertial forces, such as when moving a heavy object around or when interacting with objects with unique mass properties, current ungrounded force-feedback devices are unable to provide the quick weight-shifting sensations needed to realistically simulate weight changes over 2D surfaces. In this paper we introduce Aero-plane, a force-feedback handheld controller based on two miniature jet propellers that can render shifting weights of up to 14 N within 0.3 seconds. Through two user studies we (1) characterized the users' ability to perceive and correctly recognize different motion paths on a virtual plane while using our device, and (2) tested the level of realism and immersion of the controller in two VR applications (a rolling ball on a plane, and using kitchen tools of different shapes and sizes). Lastly, we present a set of applications that further explore different usage cases and alternative form factors for our device.
2019 · Seungwoo Je et al. · Force Feedback & Pseudo-Haptic Weight · UIST
PokeRing: Notifications by Poking Around the Finger
Smart rings are ideal for subtle and always-available haptic notifications due to their direct contact with the skin. Previous researchers have highlighted the feasibility of haptic technology in smart rings and their promise in delivering noticeable stimulations by poking a limited set of planar locations on the finger. However, the full potential of poking as a mechanism to deliver richer and more expressive information on the finger has been overlooked. With three studies and a total of 76 participants, we informed the design of PokeRing, a smart ring capable of delivering information via stimulating eight different locations around the index finger's proximal phalanx. We report our evaluation of the performance of PokeRing in semi-realistic wearable conditions (standing and walking), and its effective usage for information transfer with twenty-one spatio-temporal patterns designed by six interaction designers in a workshop. Finally, we present three applications that exploit PokeRing's notification usages.
2018 · Seungwoo Je et al. · KAIST · In-Vehicle Haptic, Audio & Multimodal Feedback; Haptic Wearables; Notification & Interruption Management · CHI
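A PokeRing-style spatio-temporal pattern can be thought of as a timed sequence over the eight poke locations around the finger. A minimal sketch, with site names and timings as illustrative assumptions (the paper's twenty-one designed patterns are not reproduced here):

```python
# Sketch: a spatio-temporal poke pattern as a list of (site, dwell) pairs.
# The eight site names and the 120 ms dwell time are assumptions for
# illustration; the paper does not specify this encoding.
SITES = ["dorsal", "dorsal-radial", "radial", "palmar-radial",
         "palmar", "palmar-ulnar", "ulnar", "dorsal-ulnar"]  # 8 locations

def clockwise_sweep(start: int, steps: int, dwell_ms: int = 120):
    """A simple example pattern: pokes that travel around the finger."""
    return [(SITES[(start + i) % len(SITES)], dwell_ms) for i in range(steps)]

print(clockwise_sweep(0, 3))
```

Even with only eight sites, varying the start location, direction, step count, and dwell time yields a large pattern vocabulary, which is what makes poking expressive enough to encode distinct notifications.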